Compare commits

...

94 Commits

Author SHA1 Message Date
hailin 1f15daa6c5 fix(planting-records): filter only MINING_ENABLED records and fix UI overflow
- Backend: Add status filter to getPlantingLedger and getPlantingSummary
- Frontend: Change Row to Wrap for info items to prevent width overflow

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 00:12:07 -08:00
hailin 8ae9e217ff fix(mining-app): fix mining records data parsing from mining-service
Map miningMinute->distributionMinute, minedAmount->shareAmount,
secondDistribution->priceSnapshot to match entity fields

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 00:02:30 -08:00
hailin 12f8fa67fc feat(mining-admin): add totalTrees, separate level/bonus pending display
- Add totalTrees field from syncedAdoption aggregate
- Rename fields: networkLevelPending, networkBonusPending
- Stats card: show level pending and bonus pending separately
- Add new stats card for total trees count
- Price overview: 2-row layout showing all contribution metrics

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:59:32 -08:00
hailin b310fde426 feat(mining-admin): show pending contribution in dashboard
- Add networkPendingContribution and networkBonusPendingContribution to API
- Display combined pending contribution (teamLevel + teamBonus) in stats card
- Replace 'total contribution' with 'pending contribution' in price overview

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:46:05 -08:00
hailin 81a58edaca fix(contribution-service): calculate totalContribution correctly in CDC event
Previously, totalContribution was incorrectly set to effectiveContribution.
Now correctly calculated as: personal + teamLevel + teamBonus

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:40:50 -08:00
hailin debc8605df fix(mining-app): rename MiningRecordsPage widget to avoid name conflict
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:33:38 -08:00
hailin dee9c511e5 feat(mining-admin): add total contribution to dashboard stats
- Add networkTotalContribution field to dashboard API response
- Display total hashrate alongside effective hashrate in stats cards
- Update price overview to show both effective and total contribution
- Change grid from 3 to 4 columns in price overview

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:32:29 -08:00
hailin 546c0060da feat(mining-app): add mining records and planting records pages
- Add mining records page showing distribution history with share amounts
- Add planting records page with adoption summary and detailed records
- Remove 推广奖励 and 收益明细 from profile page
- Add planting-ledger API endpoint and data models

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:23:31 -08:00
hailin b81ae634a6 fix(mining-app): hardcode team bonus tiers display to 15
- Profile page: 团队上级 shows '15' instead of actual unlockedBonusTiers
- Contribution page: 已解锁上级 shows '15级' instead of actual value

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 20:28:02 -08:00
hailin 0cccc0e2cd refactor(mining-app): rename VIP等级 to 团队上级 and 直推人数 to 引荐人数
- Changed "VIP等级" label to "团队上级" in profile stats row
- Changed display value from vipLevel (V3 format) to unlockedBonusTiers (raw number)
- Changed "直推人数" label to "引荐人数" for consistency

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 20:07:52 -08:00
hailin cd938f4a34 refactor(mining-app): rename team contribution labels
Update contribution page labels:
- "团队层级" → "团队下级"
- "团队奖励" → "团队上级"
- "直推人数" → "引荐人数"
- "已解锁奖励" → "已解锁上级" (with unit "档" → "级")
- "已解锁层级" → "已解锁下级"
- "直推及间推" → "引荐及间推" in subtitle

Update contribution records page labels:
- "团队层级" → "团队下级"
- "团队奖励" → "团队上级"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:58:41 -08:00
hailin 84fa3e5e19 refactor(mining-app): rename 绿积分 to 积分值 across all pages
Replace all occurrences of "绿积分" with "积分值" in:
- trading_page.dart (price display, pool name, input field)
- asset_page.dart (account labels)
- trading_account.dart (entity comment)
- price_info.dart (entity comment)
- market_overview.dart (entity comment)
- DEVELOPMENT_GUIDE.md (documentation)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:57:17 -08:00
hailin adeeadb495 fix(mining-app): update profile page - hide items and rename label
- Rename "团队层级" to "团队下级" in stats row
- Hide "实名认证" option from account settings
- Hide "我的邀请码" card section entirely
- Remove unused _buildInvitationCard and _buildActionButton methods

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:47:14 -08:00
hailin 42a28efe74 fix(mining-app): remove operator account note from expiration card
Remove the "运营账号贡献值永不失效" note from the contribution
expiration countdown card.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:28:31 -08:00
hailin 91b8cca41c feat(mining-app): implement hide/show amounts toggle
- Add hideAmountsProvider to control amount visibility
- Add tap handler to eye icon in total contribution card
- Toggle icon between visibility_outlined and visibility_off_outlined
- Hide amounts with **** when toggled in:
  - Total contribution value
  - Three column stats (personal, team level, team bonus)
  - Today's estimated earnings
  - Contribution detail summary rows

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:22:03 -08:00
hailin 02cc79d67a fix(mining-app): reduce bottom padding on navigation pages
Reduce bottom SizedBox from 100 to 24 on all four main navigation
pages (contribution, trading, asset, profile) to eliminate excessive
whitespace when scrolling to bottom.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:17:59 -08:00
hailin 7bc8547a96 fix(mining-app): rename ContributionRecordsListPage to avoid name conflict
- Rename page class from ContributionRecordsPage to ContributionRecordsListPage
- Add typedef RecordsPageData for ContributionRecordsPage data model
- Fix import statements and unused variable

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:08:09 -08:00
hailin caffb124d2 feat(mining-app): add contribution records page with category summary
- Create contribution_records_page.dart with full list view
  - Pagination support with page navigation
  - Filter by source type (personal, team level, team bonus)
  - Show detailed info: tree count, base contribution, rate, amount
  - Display effective/expire dates and status badges

- Update contribution_page.dart detail card
  - Show category summary instead of record list
  - Display three categories with icons: personal, team level, team bonus
  - Add navigation to full records page via "查看全部"

- Add route configuration for /contribution-records

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:02:30 -08:00
hailin 141db46356 fix(contribution-service): use real contributionPerTree from rate service
Previously, adoptions were synced with hardcoded contributionPerTree=1,
resulting in contribution values like 0.7 instead of the expected 15831.9.

Now the handler fetches the actual contribution rate from ContributionRateService
based on the adoption date, storing values like:
- Personal (70%): 22617 × 70% = 15831.9
- Team Level (0.5%): 22617 × 0.5% = 113.085
- Team Bonus (2.5%): 22617 × 2.5% = 565.425

Note: Historical data may need migration to apply the correct multiplier.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 18:01:30 -08:00
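A minimal sketch of the split math described in this commit, using the 22617 figure and the 70% / 0.5% / 2.5% rates stated above; the constant and function names are illustrative, not the service's actual code:

import Decimal from 'decimal.js';

// Illustrative rates taken from the commit message above.
const PERSONAL_RATE = new Decimal('0.70');    // 70% to the adopter
const TEAM_LEVEL_RATE = new Decimal('0.005'); // 0.5% per team level
const TEAM_BONUS_RATE = new Decimal('0.025'); // 2.5% per bonus tier

// contributionPerTree is now fetched per adoption date instead of being hardcoded to 1.
function splitContribution(contributionPerTree: Decimal) {
  return {
    personal: contributionPerTree.mul(PERSONAL_RATE),    // 22617 * 0.70  = 15831.9
    teamLevel: contributionPerTree.mul(TEAM_LEVEL_RATE), // 22617 * 0.005 = 113.085
    teamBonus: contributionPerTree.mul(TEAM_BONUS_RATE), // 22617 * 0.025 = 565.425
  };
}

console.log(splitContribution(new Decimal('22617')));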
hailin f57b0f9c26 chore(mining-app): configure release build
- Add kDebugMode check to LoggingInterceptor to suppress logs in release
- Remove debug print statements from contribution_providers
- Add Play Core proguard rules to fix R8 missing classes error

Build command: flutter build apk --release --split-per-abi --target-platform android-arm,android-arm64
Output:
- app-arm64-v8a-release.apk: 18MB
- app-armeabi-v7a-release.apk: 16MB

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 17:24:42 -08:00
hailin c852f24a72 fix(auth-service): add 'auth/' prefix to controller routes for Kong compatibility
Kong routes /api/v2/auth/* to auth-service without stripping the path,
so controllers need 'auth/' prefix to match frontend requests:
- SmsController: 'sms' -> 'auth/sms'
- PasswordController: 'password' -> 'auth/password'
- UserController: 'user' -> 'auth/user'

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:53:48 -08:00
hailin cb3c7623dc fix(mining-app): fix Riverpod ref usage in router redirect callback
Use cached auth state from AuthNotifier instead of ref.read() to avoid
"Cannot use ref functions after provider changed" exception during rebuild.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:49:52 -08:00
hailin f2692a50ed fix(contribution-service): fix toRecordDto using wrong property name
- Changed `record.finalContribution` to `record.amount` for getting final contribution value
- Added optional chaining to prevent undefined errors
- Added default values for safety

The ContributionRecordAggregate uses `amount` property, not `finalContribution`.
This was causing "Cannot read properties of undefined (reading 'value')" errors.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:43:14 -08:00
hailin ed9f817fae feat(mining-app): add estimated earnings and contribution stats API
- Add ContributionStats entity and model for network-wide statistics
- Add /api/v2/contribution/stats endpoint
- Implement estimatedEarningsProvider to calculate daily earnings
- Formula: (user contribution / total contribution) × daily allocation
- Update contribution page to display real estimated earnings
- Add debug logs for contribution records API

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:37:30 -08:00
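A minimal sketch of the earnings formula stated above; the function name and parameters are illustrative:

// Proportional share of the daily allocation, as described in the commit.
function estimateDailyEarnings(
  userContribution: number,
  networkTotalContribution: number,
  dailyAllocation: number,
): number {
  if (networkTotalContribution <= 0) return 0;
  return (userContribution / networkTotalContribution) * dailyAllocation;
}

// e.g. a user holding 1% of total contribution receives 1% of the daily allocation.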
hailin 6bcb4af028 feat(mining-app): integrate real APIs for Asset and Profile pages
- Asset page now uses trading-service /asset/my endpoint
- Profile page integrates auth-service /user/profile and contribution-service
- Add new entities: AssetDisplay, PriceInfo, MarketOverview, TradingAccount
- Add corresponding models with JSON parsing
- Create asset_providers and profile_providers for state management
- Update trading_providers with real API integration
- Extend UserState and UserInfo with additional profile fields
- Remove obsolete buy_shares and sell_shares use cases
- Fix compilation errors in get_current_price and trading_page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:22:40 -08:00
hailin 106a287260 fix(mining-service): make health endpoints public 2026-01-14 07:35:42 -08:00
hailin 30dc2f6665 fix(trading-service): make health endpoints public 2026-01-14 07:28:24 -08:00
hailin e1fb70e2ee feat(trading-service): add burn system, Kafka events, and idempotency
- Add trading burn system with black hole, share pool, and price calculation
- Implement per-minute auto burn and sell burn with multiplier
- Add Kafka event publishing via outbox pattern (order, trade, burn events)
- Add user.registered consumer to auto-create trading accounts
- Implement Redis + DB dual idempotency for event processing
- Add price, burn, and asset API controllers
- Add migrations for burn tables and processed events

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 07:15:41 -08:00
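A hedged sketch of what "Redis + DB dual idempotency" could look like: a fast Redis pre-check plus an authoritative processed-events check inside the same database transaction as the side effects. The key prefix, table name, and shapes are assumptions, not the trading-service schema:

import Redis from 'ioredis';
import { PrismaClient, Prisma } from '@prisma/client';

async function handleEventOnce(
  eventId: string,
  redis: Redis,
  prisma: PrismaClient,
  apply: (tx: Prisma.TransactionClient) => Promise<void>,
) {
  // 1) Cheap pre-filter: skip quickly if Redis has already seen this event.
  const fresh = await redis.set(`evt:${eventId}`, '1', 'EX', 86400, 'NX');
  if (fresh === null) return;

  // 2) Authoritative check: a processed-events row committed atomically with the side effects.
  await prisma.$transaction(async (tx) => {
    const seen = await tx.processedEvent.findUnique({ where: { eventId } });
    if (seen) return; // already processed in a previous run
    await tx.processedEvent.create({ data: { eventId } });
    await apply(tx); // business side effects commit together with the marker
  });
}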
hailin f3d4799efc feat(mining-wallet): add UserWalletCreated/Updated events for CDC sync
- Publish UserWalletCreated when a new wallet is created
- Publish UserWalletUpdated when wallet balance changes
- Events sent to cdc.mining-wallet.outbox topic for mining-admin-service

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 06:13:34 -08:00
hailin 839feab97d fix(mining-admin): handle CONTRIBUTION_CREDITED event for wallet sync
Add handler for CONTRIBUTION_CREDITED events from mining-wallet-service
to sync user wallet data to synced_user_wallets table.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 06:11:49 -08:00
hailin 465e398040 fix(mining-admin): fix wallet ledger API to match frontend expected format
- Return usdtAvailable, usdtFrozen, pendingUsdt, settleableUsdt,
  settledTotalUsdt, expiredTotalUsdt instead of old field names
- Query SyncedUserWallet table for GREEN_POINTS wallet data
- Use miningAccount.availableBalance for pendingUsdt

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:56:24 -08:00
hailin c6c875849a fix(mining-service): make mining API public for service-to-service calls
Add @Public() decorator to MiningController to allow mining-admin-service
to fetch mining records without authentication.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:46:11 -08:00
hailin ce95c40c84 fix(mining-service): listen to correct CDC topic for contribution sync
Changed event handler to:
- Listen to 'cdc.contribution.outbox' topic (CDC/Debezium format)
- Handle 'ContributionAccountUpdated' events instead of 'ContributionCalculated'
- Use effectiveContribution for mining power calculation

This fixes the issue where mining accounts had zero totalContribution
because they weren't receiving contribution sync events.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:30:38 -08:00
hailin e6d966e89f fix(mining-admin): fetch mining records from mining-service
Update getUserMiningRecords to call mining-service API instead of
returning empty records. This enables the admin dashboard to display
actual user mining records.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:14:03 -08:00
hailin 270c17829e fix(mining-admin-service): move mining routes before :category/:key parameter route
NestJS matches routes in definition order. The :category/:key route was
matching mining/status before the specific mining routes. Moved mining
routes before the parameter routes to fix routing.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:57:25 -08:00
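A small illustration of the NestJS route-ordering issue described above; only the paths come from the commit, the controller layout is assumed:

import { Controller, Get, Param } from '@nestjs/common';

// Order matters: with the parameter route first, GET /configs/mining/status
// would be captured by :category/:key. Specific routes must be declared first.
@Controller('configs')
export class ConfigsController {
  @Get('mining/status')  // specific route, declared before the catch-all
  getMiningStatus() { /* ... */ }

  @Get(':category/:key') // parameter route, declared last
  getConfig(@Param('category') category: string, @Param('key') key: string) { /* ... */ }
}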
hailin 289ac0190c fix(mining-admin-service): add logging and fix null data handling in getMiningStatus
- Add debug logging to trace mining service calls
- Return error object instead of null when data is missing
- Include error message in response for debugging

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:42:01 -08:00
hailin 467d637ccc fix(mining-admin-web): prevent duplicate /api/v2 in rewrite destination
Clean NEXT_PUBLIC_API_URL to remove trailing /api/v2 if present,
preventing paths like /api/v2/api/v2/configs/mining/status

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:37:32 -08:00
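A minimal next.config sketch of the cleanup described above, assuming a rewrite of /api/:path* to the backend; the exact rewrite rule and fallback URL are illustrative:

// Strip a trailing /api/v2 from NEXT_PUBLIC_API_URL so the rewrite destination
// never becomes /api/v2/api/v2/... (exact cleanup logic is assumed).
const rawApiUrl = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3023';
const apiBase = rawApiUrl.replace(/\/api\/v2\/?$/, '');

module.exports = {
  async rewrites() {
    return [{ source: '/api/:path*', destination: `${apiBase}/api/v2/:path*` }];
  },
};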
hailin c9690b0d36 Revert "fix(mining-admin-web): always use /api proxy instead of direct API URL"
This reverts commit 7a65ab3319.
2026-01-14 04:34:22 -08:00
hailin 7a65ab3319 fix(mining-admin-web): always use /api proxy instead of direct API URL
Browser cannot access Docker internal URLs like http://mining-admin-service:3023.
Always use /api which is proxied by Next.js rewrites to the backend service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:32:59 -08:00
hailin e99b5347da feat(mining-admin-service): add transfer-enabled API endpoints
Add GET and POST /configs/transfer-enabled endpoints to control
the transfer switch. Routes are placed before :category/:key to
avoid being matched as path parameters.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:22:11 -08:00
hailin 29dd1affe1 fix(mining-admin-web): extract data from response wrapper
mining-admin-service uses TransformInterceptor which wraps all responses
with { success, data, timestamp } format. Frontend needs to access
response.data.data to get the actual data.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:18:51 -08:00
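A minimal sketch of unwrapping that envelope on the frontend; the envelope shape comes from the commit, the helper name and use of axios are assumptions:

import axios from 'axios';

// The backend wraps every payload as { success, data, timestamp }, so with axios
// the actual payload lives at response.data.data.
interface Envelope<T> {
  success: boolean;
  data: T;
  timestamp: string;
}

async function getUnwrapped<T>(url: string): Promise<T> {
  const response = await axios.get<Envelope<T>>(url);
  return response.data.data; // axios body -> envelope -> actual data
}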
hailin a15dcafc03 fix(mining-admin-service): unwrap the data field returned by mining-service 2026-01-14 04:09:02 -08:00
hailin d404521841 fix(mining-admin-service): fix mining-service API path to use v2 2026-01-14 03:58:02 -08:00
hailin 09b15da3cb fix(mining-service): use millisecond PX instead of second EX for the Redis lock to support fractional TTLs 2026-01-14 03:52:22 -08:00
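For the PX/EX commit above, a brief sketch of why milliseconds are needed: Redis EX only accepts whole seconds, so a 0.9-second lock TTL has to be expressed as PX 900. The key name is illustrative:

import Redis from 'ioredis';

const redis = new Redis();

// EX truncates to whole seconds, so a 0.9s lock is impossible with EX.
// PX takes milliseconds: 0.9s -> 900ms.
async function acquireDistributionLock(ttlSeconds = 0.9): Promise<boolean> {
  const result = await redis.set(
    'mining:distribution:lock', '1', 'PX', Math.round(ttlSeconds * 1000), 'NX',
  );
  return result === 'OK';
}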
hailin 901247366d fix(mining-service): add tsconfig include/exclude configuration to fix the build 2026-01-14 03:48:18 -08:00
hailin 0abc04b9cb fix(mining-service): add Dockerfile build verification step 2026-01-14 03:45:51 -08:00
hailin 2b083991d0 feat(mining-service): add migration renaming minuteDistribution to secondDistribution
Supports the per-second mining distribution feature.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:40:41 -08:00
hailin 8f616dd45b fix(mining-service): fix Dockerfile to support prisma seed
- Add ts-node/typescript to the production image so the seed can run
- Run prisma db seed in the startup script
- Copy tsconfig.json into the production image

Modeled on the mining-wallet-service Dockerfile configuration.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:35:34 -08:00
hailin 1008672af9 Revert "fix(mining-service): fix Docker build issues"
This reverts commit f4380604d9.
2026-01-14 03:34:58 -08:00
hailin f4380604d9 fix(mining-service): fix Docker build issues
- Add include/exclude to tsconfig.json to exclude the prisma folder
- Add .dockerignore to exclude seed.ts
- Add build verification to the Dockerfile

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:34:04 -08:00
hailin 3b61f2e095 feat(mining): implement per-second mining distribution system
Core changes:
- Scheduler runs every second instead of every minute, so users see mining earnings each second
- Account balances are updated every second, but MiningRecord is aggregated and written once per minute (reduces data volume)
- Seed runs automatically (prisma.seed config); isActive=false after initialization
- Only one manual step remains: an admin clicks "启动挖矿" (start mining) in the admin console

Technical details:
- Per-second allocation: 1,000,000 / 63,072,000 seconds ≈ 0.01585 shares/second
- Redis accumulator: per-second mining data accumulates in Redis and is flushed to the database at the end of each minute
- Distributed lock: 0.9-second lock duration, supporting multi-instance deployment
- Admin UI: add a mining status card and activate/deactivate buttons

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:25:47 -08:00
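The per-second figure above follows directly from the era length, assuming an era of two years of 730 days:

// One era = 2 years ≈ 730 * 86,400 s = 63,072,000 seconds,
// so 1,000,000 shares / 63,072,000 s ≈ 0.01585 shares per second.
const ERA_SECONDS = 730 * 86_400;        // 63,072,000
const ERA_DISTRIBUTION = 1_000_000;
const perSecondShares = ERA_DISTRIBUTION / ERA_SECONDS;
console.log(perSecondShares.toFixed(5)); // "0.01585"

// Per second: accumulate each account's share in Redis (e.g. INCRBYFLOAT);
// at the end of each minute, flush the totals into MiningRecord rows in one batch,
// as described in the commit above.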
hailin 25608babd6 feat(mining-service): add initialization APIs and seed script
Add admin endpoints:
- GET /admin/status - Get mining system status
- POST /admin/initialize - Initialize mining config (one-time)
- POST /admin/activate - Activate mining distribution

Add prisma seed script for database initialization:
- MiningConfig: 100.02B total shares, 2,000,000 distribution pool
- BlackHole: 10 billion burn target
- MiningEra: first era with a 1,000,000 distribution
- PoolAccounts: SHARE_POOL, BLACK_HOLE_POOL, CIRCULATION_POOL

Based on requirements:
- First two years: distribute 1,000,000 share points
- Second two years: distribute 500,000 share points (halved)
- Burn 10 billion into the black hole over 10 years

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:36:52 -08:00
hailin bd0f98cfb3 fix(mining-admin-web): fix audit logs page crash
- Use 'all' instead of empty string for SelectItem value (Radix requirement)
- Add null safety for items array with fallback to empty array
- Fix potential undefined access on data.items

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:30:07 -08:00
hailin a2adddbf3d fix(mining-admin): transform dashboard API response to match frontend expected format
Frontend expects flat DashboardStats and RealtimeData interfaces.
Transform backend nested response to:
- totalUsers, adoptedUsers, networkEffectiveContribution, etc.
- currentMinuteDistribution, activeOrders, pendingTrades, etc.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:23:54 -08:00
hailin d6064294d7 refactor(mining-admin): remove initialization feature
System initialization is now handled by seed scripts and CDC sync,
so the manual initialization UI is no longer needed.

Removed:
- Frontend: initialization page and sidebar menu item
- Backend: InitializationController and InitializationService

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:22:23 -08:00
hailin 36c3ada6a6 fix(mining-admin): fix audit logs API path and response format
- Change controller path from /audit-logs to /audit to match frontend
- Transform response to frontend expected format (items, totalPages, etc.)
- Map admin.username to adminUsername field
- Add keyword query parameter support

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:18:53 -08:00
hailin 13e94db450 feat(mining-admin): add /reports/daily endpoint for frontend reports page
Add ReportsController with /reports/daily endpoint that maps the
dashboard service data to the format expected by the frontend.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:10:47 -08:00
hailin feb871bcf1 feat(mining-admin): add daily report generation service
Add DailyReportService that:
- Generates daily reports on startup
- Updates reports every hour
- Collects stats from synced tables (users, adoptions, contributions, mining, trading)
- Supports historical report generation for backfilling

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:03:21 -08:00
hailin 4292d5da66 fix(mining-admin-web): fix TypeScript type for empty mainPools array 2026-01-14 01:55:58 -08:00
hailin a7a2282ba7 fix(mining-admin-web): update account type categorization to match backend
Update categorizeAccounts to use correct account types returned by backend:
- Core accounts: HEADQUARTERS, OPERATION, FEE
- Region accounts: PROVINCE, CITY

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:53:11 -08:00
hailin fa6826dde3 fix(mining-admin): use CDC synced tables for system accounts API
Change SystemAccountsService to read from syncedWalletSystemAccount and
syncedWalletPoolAccount tables instead of local tables. This fixes the
issue where the frontend shows "暂无数据" despite data being synced.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:44:22 -08:00
hailin eff71a6b22 feat(mining-wallet): publish outbox events for system/pool accounts
Add WalletSystemAccountCreated and WalletPoolAccountCreated events:
- seed.ts: publish events when creating HQ/OP/FEE and pool accounts
- contribution-wallet.service.ts: publish events when auto-creating
  province/city system accounts

This enables mining-admin-service to sync system accounts via CDC.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:28:48 -08:00
hailin 0bbb52284c fix(contribution): avoid nested transaction timeout in BonusClaimService
Use unitOfWork.isInTransaction() to detect if already in a transaction
context (called from ContributionCalculationService). If so, reuse the
existing transaction instead of opening a new one, preventing Prisma
interactive transaction timeout errors.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:02:08 -08:00
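A minimal sketch of the reuse-or-open pattern described above; isInTransaction() is named in the commit, the rest of the unit-of-work shape is assumed:

// Reuse the ambient transaction when one exists, otherwise open a new one.
async function claimWithinTransaction(
  unitOfWork: {
    isInTransaction(): boolean;
    run<T>(fn: () => Promise<T>): Promise<T>;
  },
  doClaim: () => Promise<void>,
): Promise<void> {
  if (unitOfWork.isInTransaction()) {
    // Already inside ContributionCalculationService's transaction: just do the work.
    await doClaim();
  } else {
    // Standalone call: open a transaction of our own.
    await unitOfWork.run(doClaim);
  }
}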
hailin 7588d18fff fix(mining-wallet): fix province/city creation and add seed on startup
- Use provinceCode directly instead of inferring from cityCode
- Use code as name for province/city records
- Add ts-node to production for seed execution
- Run prisma db seed on container startup

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 00:40:49 -08:00
hailin e6e44d9a43 Revert "fix(mining-wallet): auto-create HEADQUARTERS account, skip DEFAULT province/city"
This reverts commit bf004bab52.
2026-01-14 00:19:12 -08:00
hailin bf004bab52 fix(mining-wallet): auto-create HEADQUARTERS account, skip DEFAULT province/city 2026-01-14 00:18:53 -08:00
hailin a03b883350 fix(mining-wallet): exclude prisma directory from TypeScript compilation 2026-01-14 00:07:58 -08:00
hailin 2a79c83715 feat(contribution): implement TEAM_BONUS backfill when unlock conditions met
When a user's direct referral count reaches 2 or 4, the system now automatically
backfills previously pending TEAM_BONUS (T2/T3) contributions that were allocated
to headquarters while waiting for unlock conditions.

- Add BonusClaimService for handling bonus backfill logic
- Add findPendingBonusByAccountSequence and claimBonusRecords to repository
- Integrate bonus claim into updateReferrerUnlockStatus flow
- Add BonusClaimed event consumer in mining-wallet-service
- Generate ledger records for backfilled contributions

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:58:54 -08:00
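A hedged sketch of the backfill flow described above; findPendingBonusByAccountSequence and claimBonusRecords are named in the commit, while their signatures and the maxTier parameter are assumptions:

async function backfillPendingBonus(
  repo: {
    findPendingBonusByAccountSequence(accountSequence: string, maxTier: number): Promise<{ id: bigint }[]>;
    claimBonusRecords(recordIds: bigint[], accountSequence: string): Promise<void>;
  },
  accountSequence: string,
  newlyUnlockedTier: number,
): Promise<void> {
  // TEAM_BONUS records parked at headquarters while the tier was locked become claimable.
  const pending = await repo.findPendingBonusByAccountSequence(accountSequence, newlyUnlockedTier);
  if (pending.length === 0) return;
  await repo.claimBonusRecords(pending.map((r) => r.id), accountSequence);
  // A BonusClaimed event is then published for mining-wallet-service to generate ledger records.
}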
hailin ef330a2687 feat(mining-wallet): add seed and auto-create province/city accounts
- Add prisma seed to initialize core system accounts (HQ, OP, FEE) and pool accounts
- Auto-create province/city system accounts on-demand during contribution distribution
- Province/city regions are also auto-created if they do not exist

This ensures:
1. Core accounts exist after deployment (via seed)
2. Province/city accounts are created dynamically as orders come in

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:36:31 -08:00
hailin 6594845d4c fix(mining-wallet): fix Kafka consumers not subscribing to topics
- Change consumers from @Injectable to @Controller for @EventPattern to work
- Move consumers from providers to controllers array in module
- Add subscribe.fromBeginning config to Kafka microservice

The consumers were not receiving messages because NestJS microservices
require @EventPattern handlers to be in @Controller classes, not just
@Injectable services.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:31:31 -08:00
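A minimal sketch of the fix described above: @EventPattern handlers are only discovered on classes registered in the module's controllers array. The topic and class names are illustrative:

import { Controller } from '@nestjs/common';
import { EventPattern, Payload } from '@nestjs/microservices';

@Controller()
export class ContributionEventsConsumer {
  @EventPattern('contribution.bonus-claimed') // illustrative topic name
  async onBonusClaimed(@Payload() message: unknown): Promise<void> {
    // handle the event...
  }
}

// In the module definition, the consumer goes into controllers, not providers:
// @Module({ controllers: [ContributionEventsConsumer], providers: [/* services */] })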
hailin 77b682c8a8 feat(mining-wallet): make initialize endpoints public for internal network calls
Changed system-accounts/initialize and pool-accounts/initialize endpoints from
@AdminOnly to @Public to allow deploy scripts to call them without authentication.
These endpoints are only accessible from internal network.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:22:17 -08:00
hailin 6ec79a6672 fix(deploy): correct CDC sync API URL path
Change from /health/cdc-sync to /api/v2/health/cdc-sync

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 22:26:32 -08:00
hailin 631fe2bf31 fix(contribution-service): reset consumer group offsets to earliest on startup
Use admin.resetOffsets({ earliest: true }) before connecting consumer
to ensure CDC sync always starts from the beginning of Kafka topics,
regardless of previously committed offsets.

This fixes the infinite loop issue where existing consumer groups
had committed offsets at high watermark, causing eachMessage to
never be called.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 22:14:51 -08:00
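A minimal sketch of the startup reset with kafkajs, assuming illustrative group and topic names; resetOffsets must run before the consumer joins the group:

import { Kafka } from 'kafkajs';

const kafka = new Kafka({ clientId: 'contribution-service', brokers: ['kafka:9092'] });

async function resetThenConsume() {
  const admin = kafka.admin();
  await admin.connect();
  // Reset committed offsets to earliest before the consumer connects,
  // so eachMessage replays the topic from the beginning.
  await admin.resetOffsets({ groupId: 'contribution-cdc', topic: 'cdc.auth.outbox', earliest: true });
  await admin.disconnect();

  const consumer = kafka.consumer({ groupId: 'contribution-cdc' });
  await consumer.connect();
  await consumer.subscribe({ topics: ['cdc.auth.outbox'], fromBeginning: true });
  await consumer.run({ eachMessage: async ({ message }) => { /* process CDC event */ } });
}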
hailin d968efcad4 fix(contribution): run CDC sync in background to allow API access during sync
Change CDC consumer startup from blocking await to non-blocking .then()
so HTTP server starts immediately and /health/cdc-sync API is accessible
for deploy script to poll sync status.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:50:59 -08:00
hailin 5a4970d7d9 Revert "fix(contribution): run CDC sync in background to avoid blocking service startup"
This reverts commit 703c12e9f6.
2026-01-13 21:44:18 -08:00
hailin 703c12e9f6 fix(contribution): run CDC sync in background to avoid blocking service startup
- Change await to .then() for cdcConsumer.start()
- Allows HTTP endpoints to be accessible during CDC sync

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:44:00 -08:00
hailin 8199bc4d66 feat(contribution): add CDC sync status API and fix deploy script timing
- Add initialSyncCompleted flag to track CDC sequential sync completion
- Add getSyncStatus() method to CDCConsumerService
- Add /health/cdc-sync endpoint to expose sync status
- Update deploy-mining.sh to wait for CDC sync completion before calling publish APIs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:34:58 -08:00
hailin aef6feb2cd fix(contribution): use unique consumer group id for each phase
Previous consumer group had already consumed messages, so fromBeginning
had no effect. Now using timestamp-based unique group id to ensure
fresh consumption from beginning each time.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:11:40 -08:00
hailin 22523aba14 revert: restore blocking await for sequential CDC consumption
The previous change was wrong - running sequential consumption in
background defeats its purpose. The whole point is to ensure data
dependency order (users -> referrals -> adoptions) before any other
operations can proceed.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:07:57 -08:00
hailin a01fd3aa86 fix(contribution): run sequential CDC consumption in background
Prevents blocking NestJS onModuleInit during CDC sync by running
the sequential consumption in the background with error handling.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:07:11 -08:00
hailin d58e8b44ee feat(contribution): implement sequential CDC topic consumption
Implements sequential phase consumption to ensure correct data sync order:
1. User accounts (first)
2. Referral relationships (depends on users)
3. Planting orders (depends on users and referrals)

Each phase must complete before the next starts, guaranteeing 100%
reliable data dependency ordering. After all phases complete, switches
to continuous parallel consumption for real-time updates.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:57:24 -08:00
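A minimal sketch of the phase ordering described above; the topic names and drain mechanics are assumptions, not the actual implementation:

// Drain each topic fully (up to its current high watermark) before starting the next,
// then switch to parallel streaming for real-time updates.
const PHASES = [
  'cdc.auth.users',      // 1) user accounts
  'cdc.referral.outbox', // 2) referral relationships (needs users)
  'cdc.planting.orders', // 3) planting orders (needs users + referrals)
];

async function sequentialSync(
  drainTopic: (topic: string) => Promise<void>,
  startParallel: (topics: string[]) => Promise<void>,
) {
  for (const topic of PHASES) {
    await drainTopic(topic);   // block until this phase reaches the high watermark
  }
  await startParallel(PHASES); // continuous consumption after the backfill completes
}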
hailin 30949af577 revert: undo unauthorized ancestor_path and setDirectReferralAdoptedCount changes
Reverts commits:
- 1fbb88f7: setDirectReferralAdoptedCount change
- 471702d5: ancestor_path chain building change

These changes were made without authorization. The original code was correct.
MINING_ENABLED filtering (from dbf97ae4) is preserved.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:46:41 -08:00
hailin 1fbb88f773 fix(contribution): use setDirectReferralAdoptedCount for accurate count update
Changed updateReferrerUnlockStatus to:
1. Create account if not exists (for full-reset scenarios)
2. Use setDirectReferralAdoptedCount instead of increment loop
3. This ensures the count is always accurate regardless of processing order

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:29:53 -08:00
hailin 5eae4464ef fix(mining-app): remove unnecessary token refresh on app startup
Users were being redirected to login page when clicking navigation
because the background token refresh was failing and clearing user state.

Token refresh should only happen when API returns 401, not on every app launch.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:28:07 -08:00
hailin d43a70de93 feat(mining-admin): implement complete system accounts feature
- Add system account types and display metadata
- Create API layer with getList and getSummary endpoints
- Add React Query hooks for data fetching
- Create AccountCard, AccountsTable, SummaryCards components
- Refactor page with tabs, refresh button, and error handling
- Add Alert UI component

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:27:59 -08:00
hailin 471702d562 fix(contribution): use ancestor_path to build upline chain for TEAM_LEVEL distribution
Root cause: CDC sync order issue caused referrerAccountSequence to be null,
resulting in empty ancestor chain and all TEAM_LEVEL contributions going to unallocated.

Changes:
- buildAncestorChainFromReferral: Uses ancestor_path (contains complete user_id chain) to build upline chain
- getDirectReferrer: Gets direct referrer using ancestor_path as fallback
- findAncestorChain: Updated to use ancestor_path when available

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:14:46 -08:00
hailin dbf97ae487 fix(contribution-service): filter adoptions by MINING_ENABLED status
Only process adoptions with MINING_ENABLED status for contribution calculation.
This fixes the bug where non-final adoption records (PENDING, PAID, etc.) were
incorrectly being processed, causing duplicate contribution records.

Affected methods:
- findUndistributedAdoptions: only process MINING_ENABLED adoptions
- getDirectReferralAdoptedCount: only count users with MINING_ENABLED adoptions
- getTotalTreesByAccountSequence: only sum trees from MINING_ENABLED adoptions
- getTeamTreesByLevel: only count MINING_ENABLED adoptions
- countUndistributedAdoptions: only count MINING_ENABLED adoptions

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 19:48:34 -08:00
hailin fdfc2d6700 fix(contribution): ensure 100% reliable CDC sync to mining-admin-service
- Add ContributionAccountUpdatedEvent for real-time account updates
- Publish outbox events when saving distribution results
- Publish outbox events when updating adopter/referrer unlock status
- Add incremental sync every 10 minutes for recently updated accounts
- Add daily full sync at 4am as final consistency guarantee
- Add findRecentlyUpdated repository method for incremental sync

Three-layer sync guarantee:
1. Real-time: publish events on every account update
2. Incremental: scan accounts updated in last 15 minutes every 10 mins
3. Full sync: publish all accounts daily at 4am

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 19:27:50 -08:00
hailin 3999d7cc51 fix(contribution): 100% sync CDC data and fix calculation trigger timing
- Remove conditional skip logic in CDC handlers
- Always sync all field updates (including status changes)
- Trigger contribution calculation only when status becomes MINING_ENABLED
- Fix user and referral handlers to sync all fields without skipping

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 16:55:25 -08:00
hailin 20eabbb85f fix(mining-admin): restore MINING_ENABLED status filter for adoption stats
Revert the previous change that removed the status filter. The stats
should only count adoptions with MINING_ENABLED status, as only those
are active for mining. The issue is likely that the status field in
synced_adoptions table doesn't have the correct value.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 01:32:39 -08:00
hailin 65bd4f9b65 fix(mining-admin): remove MINING_ENABLED status filter for adoption stats
The adoption stats were showing 0 because the synced_adoptions table
contains status values directly from 1.0 system (PAID, POOL_INJECTED, etc.)
rather than MINING_ENABLED. Since contribution-service doesn't update the
status after calculating contributions, we now count all synced adoptions.

Changes:
- Remove status filter in getAdoptionStatsForUsers
- Remove status filter in getUserDetail adoption queries
- Remove status filter in getUserAdoptionStats for referral tree
- Add order count display in user detail page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 01:21:01 -08:00
hailin 2f3a0f3652 feat(mining-admin): display adoption order count in user management
Backend:
- Add personalOrders and teamOrders to adoption stats
- Return order count alongside tree count in user list API

Frontend:
- Add personalAdoptionOrders and teamAdoptionOrders to UserOverview type
- Display format: "树数量(订单数)" e.g. "6(3单)"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 01:03:59 -08:00
hailin 56ff8290c1 fix(mining-admin): filter adoption stats by MINING_ENABLED status
Only count adoptions with status='MINING_ENABLED' when calculating:
- Personal adoption count (user list)
- Team adoption count (user list)
- Personal adoption stats (user detail)
- Direct referral adoptions (user detail)
- Team adoptions (user detail)
- Referral tree adoption stats

This fixes incorrect adoption counts that included pending/unconfirmed orders.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 00:58:01 -08:00
hailin 1d7d38a82c fix(frontend): prevent redirect to dashboard on page refresh
Fix hydration race condition where token check happened before
localStorage was read. Now waits for client-side initialization
before deciding whether to redirect to login.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 00:25:59 -08:00
170 changed files with 11015 additions and 2175 deletions

View File

@ -767,7 +767,15 @@
"Bash(git -C \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\" commit -m \"$\\(cat <<''EOF''\nfix\\(mining-app\\): update splash page theme and fix token refresh\n\n- Update splash_page.dart to orange theme \\(#FF6B00\\) matching other pages\n- Change app name from \"榴莲挖矿\" to \"榴莲生态\"\n- Fix refreshTokenIfNeeded to properly throw on failure instead of\n silently calling logout \\(which caused Riverpod ref errors\\)\n- Clear local storage directly on refresh failure without remote API call\n\nCo-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>\nEOF\n\\)\")",
"Bash(python3 -c \" import sys content = sys.stdin.read\\(\\) old = '''''' done # 清空 processed_cdc_events 表(因为 migration 时可能已经消费了一些消息) # 这是事务性幂等消费的关键:重置 Kafka offset 后必须同时清空幂等记录 log_info \"\"Truncating processed_cdc_events tables to allow re-consumption...\"\" for db in \"\"rwa_contribution\"\" \"\"rwa_auth\"\"; do if run_psql \"\"$db\"\" \"\"TRUNCATE TABLE processed_cdc_events;\"\" 2>/dev/null; then log_success \"\"Truncated processed_cdc_events in $db\"\" else log_warn \"\"Could not truncate processed_cdc_events in $db \\(table may not exist yet\\)\"\" fi done log_step \"\"Step 9/18: Starting 2.0 services...\"\"'''''' new = '''''' done # 清空 processed_cdc_events 表(因为 migration 时可能已经消费了一些消息) # 这是事务性幂等消费的关键:重置 Kafka offset 后必须同时清空幂等记录 log_info \"\"Truncating processed_cdc_events tables to allow re-consumption...\"\" for db in \"\"rwa_contribution\"\" \"\"rwa_auth\"\"; do if run_psql \"\"$db\"\" \"\"TRUNCATE TABLE processed_cdc_events;\"\" 2>/dev/null; then log_success \"\"Truncated processed_cdc_events in $db\"\" else log_warn \"\"Could not truncate processed_cdc_events in $db \\(table may not exist yet\\)\"\" fi done log_step \"\"Step 9/18: Starting 2.0 services...\"\"'''''' print\\(content.replace\\(old, new\\)\\) \")",
"Bash(git rm:*)",
"Bash(echo \"请在服务器运行以下命令检查 outbox 事件:\n\ndocker exec -it rwa-postgres psql -U rwa_user -d rwa_contribution -c \"\"\nSELECT id, event_type, aggregate_id, \n payload->>''sourceType'' as source_type,\n payload->>''accountSequence'' as account_seq,\n payload->>''sourceAccountSequence'' as source_account_seq,\n payload->>''bonusTier'' as bonus_tier\nFROM outbox_events \nWHERE payload->>''accountSequence'' = ''D25122900007''\nORDER BY id;\n\"\"\")"
"Bash(echo \"请在服务器运行以下命令检查 outbox 事件:\n\ndocker exec -it rwa-postgres psql -U rwa_user -d rwa_contribution -c \"\"\nSELECT id, event_type, aggregate_id, \n payload->>''sourceType'' as source_type,\n payload->>''accountSequence'' as account_seq,\n payload->>''sourceAccountSequence'' as source_account_seq,\n payload->>''bonusTier'' as bonus_tier\nFROM outbox_events \nWHERE payload->>''accountSequence'' = ''D25122900007''\nORDER BY id;\n\"\"\")",
"Bash(ssh -o ConnectTimeout=10 ceshi@14.215.128.96 'find /home/ceshi/rwadurian/frontend/mining-admin-web -name \"\"*.tsx\"\" -o -name \"\"*.ts\"\" | xargs grep -l \"\"用户管理\\\\|users\"\" 2>/dev/null | head -10')",
"Bash(dir /s /b \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\")",
"Bash(dir /b \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\backend\\\\services\")",
"Bash(ssh -J ceshi@103.39.231.231 ceshi@192.168.1.111 \"curl -s http://localhost:3021/api/v2/admin/status\")",
"Bash(del \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\frontend\\\\mining-app\\\\lib\\\\domain\\\\usecases\\\\trading\\\\buy_shares.dart\")",
"Bash(del \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\frontend\\\\mining-app\\\\lib\\\\domain\\\\usecases\\\\trading\\\\sell_shares.dart\")",
"Bash(ls -la \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\frontend\\\\mining-app\\\\lib\\\\presentation\\\\pages\"\" 2>/dev/null || dir /b \"c:UsersdongDesktoprwadurianfrontendmining-applibpresentationpages \")",
"Bash(cd:*)"
],
"deny": [],
"ask": []

View File

@ -22,7 +22,7 @@ class ChangePasswordDto {
newPassword: string;
}
@Controller('password')
@Controller('auth/password')
@UseGuards(ThrottlerGuard)
export class PasswordController {
constructor(private readonly passwordService: PasswordService) {}

View File

@ -21,7 +21,7 @@ class VerifySmsDto {
type: 'REGISTER' | 'LOGIN' | 'RESET_PASSWORD' | 'CHANGE_PHONE';
}
@Controller('sms')
@Controller('auth/sms')
@UseGuards(ThrottlerGuard)
export class SmsController {
constructor(private readonly smsService: SmsService) {}

View File

@ -7,7 +7,7 @@ import { UserService, UserProfileResult } from '@/application/services';
import { JwtAuthGuard } from '@/shared/guards/jwt-auth.guard';
import { CurrentUser } from '@/shared/decorators/current-user.decorator';
@Controller('user')
@Controller('auth/user')
@UseGuards(JwtAuthGuard)
export class UserController {
constructor(private readonly userService: UserService) {}

View File

@ -1,8 +1,9 @@
import { Controller, Get, Param, Query, NotFoundException } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiResponse, ApiParam } from '@nestjs/swagger';
import { ApiTags, ApiOperation, ApiResponse, ApiParam, ApiQuery } from '@nestjs/swagger';
import { GetContributionAccountQuery } from '../../application/queries/get-contribution-account.query';
import { GetContributionStatsQuery } from '../../application/queries/get-contribution-stats.query';
import { GetContributionRankingQuery } from '../../application/queries/get-contribution-ranking.query';
import { GetPlantingLedgerQuery, PlantingLedgerDto } from '../../application/queries/get-planting-ledger.query';
import {
ContributionAccountResponse,
ContributionRecordsResponse,
@ -19,6 +20,7 @@ export class ContributionController {
private readonly getAccountQuery: GetContributionAccountQuery,
private readonly getStatsQuery: GetContributionStatsQuery,
private readonly getRankingQuery: GetContributionRankingQuery,
private readonly getPlantingLedgerQuery: GetPlantingLedgerQuery,
) {}
@Get('stats')
@ -95,4 +97,22 @@ export class ContributionController {
}
return result;
}
@Get('accounts/:accountSequence/planting-ledger')
@ApiOperation({ summary: '获取账户认种分类账' })
@ApiParam({ name: 'accountSequence', description: '账户序号' })
@ApiQuery({ name: 'page', required: false, type: Number, description: '页码' })
@ApiQuery({ name: 'pageSize', required: false, type: Number, description: '每页数量' })
@ApiResponse({ status: 200, description: '认种分类账' })
async getPlantingLedger(
@Param('accountSequence') accountSequence: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
): Promise<PlantingLedgerDto> {
return this.getPlantingLedgerQuery.execute(
accountSequence,
page ?? 1,
pageSize ?? 20,
);
}
}

View File

@ -2,6 +2,7 @@ import { Controller, Get } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiResponse } from '@nestjs/swagger';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { CDCConsumerService } from '../../infrastructure/kafka/cdc-consumer.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
interface HealthStatus {
@ -20,6 +21,7 @@ export class HealthController {
constructor(
private readonly prisma: PrismaService,
private readonly redis: RedisService,
private readonly cdcConsumer: CDCConsumerService,
) {}
@Get()
@ -68,4 +70,15 @@ export class HealthController {
async live(): Promise<{ alive: boolean }> {
return { alive: true };
}
@Get('cdc-sync')
@ApiOperation({ summary: 'CDC 同步状态检查' })
@ApiResponse({ status: 200, description: 'CDC 同步状态' })
async cdcSyncStatus(): Promise<{
isRunning: boolean;
sequentialMode: boolean;
allPhasesCompleted: boolean;
}> {
return this.cdcConsumer.getSyncStatus();
}
}

View File

@ -12,12 +12,14 @@ import { CDCEventDispatcher } from './event-handlers/cdc-event-dispatcher';
import { ContributionCalculationService } from './services/contribution-calculation.service';
import { ContributionDistributionPublisherService } from './services/contribution-distribution-publisher.service';
import { ContributionRateService } from './services/contribution-rate.service';
import { BonusClaimService } from './services/bonus-claim.service';
import { SnapshotService } from './services/snapshot.service';
// Queries
import { GetContributionAccountQuery } from './queries/get-contribution-account.query';
import { GetContributionStatsQuery } from './queries/get-contribution-stats.query';
import { GetContributionRankingQuery } from './queries/get-contribution-ranking.query';
import { GetPlantingLedgerQuery } from './queries/get-planting-ledger.query';
// Schedulers
import { ContributionScheduler } from './schedulers/contribution.scheduler';
@ -38,12 +40,14 @@ import { ContributionScheduler } from './schedulers/contribution.scheduler';
ContributionCalculationService,
ContributionDistributionPublisherService,
ContributionRateService,
BonusClaimService,
SnapshotService,
// Queries
GetContributionAccountQuery,
GetContributionStatsQuery,
GetContributionRankingQuery,
GetPlantingLedgerQuery,
// Schedulers
ContributionScheduler,
@ -55,6 +59,7 @@ import { ContributionScheduler } from './schedulers/contribution.scheduler';
GetContributionAccountQuery,
GetContributionStatsQuery,
GetContributionRankingQuery,
GetPlantingLedgerQuery,
],
})
export class ApplicationModule {}

View File

@ -2,6 +2,7 @@ import { Injectable, Logger } from '@nestjs/common';
import Decimal from 'decimal.js';
import { CDCEvent, TransactionClient } from '../../infrastructure/kafka/cdc-consumer.service';
import { ContributionCalculationService } from '../services/contribution-calculation.service';
import { ContributionRateService } from '../services/contribution-rate.service';
/**
*
@ -15,19 +16,11 @@ export interface AdoptionSyncResult {
* CDC
* 1.0 planting-service同步过来的planting_orders数据
*
*
*
* ===========================================
* - handle() synced_adoptions
* - AdoptionSyncResultID
* - calculateForAdoption
*
* calculateForAdoption
* 1. calculateForAdoption 使
* 2. Serializable
* 3. "Adoption not found" synced_adoptions
*
* Kafka Idempotent Consumer & Transactional Outbox Pattern
* https://www.lydtechconsulting.com/blog/kafka-idempotent-consumer-transactional-outbox
* - handle() 100%
* - status MINING_ENABLED
* - Serializable
*/
@Injectable()
export class AdoptionSyncedHandler {
@ -35,6 +28,7 @@ export class AdoptionSyncedHandler {
constructor(
private readonly contributionCalculationService: ContributionCalculationService,
private readonly contributionRateService: ContributionRateService,
) {}
/**
@ -48,13 +42,28 @@ export class AdoptionSyncedHandler {
this.logger.log(`[CDC] Adoption event received: op=${op}, seq=${event.sequenceNum}`);
this.logger.debug(`[CDC] Adoption event payload: ${JSON.stringify(after || before)}`);
// 获取认种日期,用于查询当日贡献值
const data = after || before;
const adoptionDate = data?.created_at || data?.createdAt || data?.paid_at || data?.paidAt;
// 在事务外获取当日每棵树的贡献值
let contributionPerTree = new Decimal('22617'); // 默认值
if (adoptionDate) {
try {
contributionPerTree = await this.contributionRateService.getContributionPerTree(new Date(adoptionDate));
this.logger.log(`[CDC] Got contributionPerTree for ${adoptionDate}: ${contributionPerTree.toString()}`);
} catch (error) {
this.logger.warn(`[CDC] Failed to get contributionPerTree, using default 22617`, error);
}
}
try {
switch (op) {
case 'c': // create
case 'r': // read (snapshot)
return await this.handleCreate(after, event.sequenceNum, tx);
return await this.handleCreate(after, event.sequenceNum, tx, contributionPerTree);
case 'u': // update
return await this.handleUpdate(after, before, event.sequenceNum, tx);
return await this.handleUpdate(after, before, event.sequenceNum, tx, contributionPerTree);
case 'd': // delete
await this.handleDelete(before);
return null;
@ -86,21 +95,21 @@ export class AdoptionSyncedHandler {
}
}
private async handleCreate(data: any, sequenceNum: bigint, tx: TransactionClient): Promise<AdoptionSyncResult | null> {
private async handleCreate(data: any, sequenceNum: bigint, tx: TransactionClient, contributionPerTree: Decimal): Promise<AdoptionSyncResult | null> {
if (!data) {
this.logger.warn(`[CDC] Adoption create: empty data received`);
return null;
}
// planting_orders表字段: order_id, account_sequence, tree_count, created_at, status, selected_province, selected_city
const orderId = data.order_id || data.id;
const accountSequence = data.account_sequence || data.accountSequence;
const treeCount = data.tree_count || data.treeCount;
const createdAt = data.created_at || data.createdAt || data.paid_at || data.paidAt;
const selectedProvince = data.selected_province || data.selectedProvince || null;
const selectedCity = data.selected_city || data.selectedCity || null;
const status = data.status ?? null;
this.logger.log(`[CDC] Adoption create: orderId=${orderId}, account=${accountSequence}, trees=${treeCount}, province=${selectedProvince}, city=${selectedCity}`);
this.logger.log(`[CDC] Adoption create: orderId=${orderId}, account=${accountSequence}, trees=${treeCount}, status=${status}, contributionPerTree=${contributionPerTree.toString()}`);
if (!orderId || !accountSequence) {
this.logger.warn(`[CDC] Invalid adoption data: missing order_id or account_sequence`, { data });
@ -109,8 +118,7 @@ export class AdoptionSyncedHandler {
const originalAdoptionId = BigInt(orderId);
// 在事务中保存同步的认种订单数据
this.logger.log(`[CDC] Upserting synced adoption: ${orderId}`);
// 100%同步数据,使用真实的每棵树贡献值
await tx.syncedAdoption.upsert({
where: { originalAdoptionId },
create: {
@ -118,10 +126,10 @@ export class AdoptionSyncedHandler {
accountSequence,
treeCount,
adoptionDate: new Date(createdAt),
status: data.status ?? null,
status,
selectedProvince,
selectedCity,
contributionPerTree: new Decimal('1'), // 每棵树1算力
contributionPerTree,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
@ -129,25 +137,26 @@ export class AdoptionSyncedHandler {
accountSequence,
treeCount,
adoptionDate: new Date(createdAt),
status: data.status ?? undefined,
selectedProvince: selectedProvince ?? undefined,
selectedCity: selectedCity ?? undefined,
contributionPerTree: new Decimal('1'),
status,
selectedProvince,
selectedCity,
contributionPerTree,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
});
this.logger.log(`[CDC] Adoption synced successfully: orderId=${orderId}, account=${accountSequence}, trees=${treeCount}`);
this.logger.log(`[CDC] Adoption synced: orderId=${orderId}, status=${status}`);
// 返回结果,供事务提交后计算算力
// 只有 MINING_ENABLED 状态才触发算力计算
const needsCalculation = status === 'MINING_ENABLED';
return {
originalAdoptionId,
needsCalculation: true,
needsCalculation,
};
}
private async handleUpdate(after: any, before: any, sequenceNum: bigint, tx: TransactionClient): Promise<AdoptionSyncResult | null> {
private async handleUpdate(after: any, before: any, sequenceNum: bigint, tx: TransactionClient, contributionPerTree: Decimal): Promise<AdoptionSyncResult | null> {
if (!after) {
this.logger.warn(`[CDC] Adoption update: empty after data received`);
return null;
@ -155,37 +164,22 @@ export class AdoptionSyncedHandler {
const orderId = after.order_id || after.id;
const originalAdoptionId = BigInt(orderId);
this.logger.log(`[CDC] Adoption update: orderId=${orderId}`);
// 检查是否已经处理过(使用事务客户端)
const existingAdoption = await tx.syncedAdoption.findUnique({
where: { originalAdoptionId },
});
if (existingAdoption?.contributionDistributed) {
// 如果树数量发生变化,需要重新计算(这种情况较少)
const newTreeCount = after.tree_count || after.treeCount;
if (existingAdoption.treeCount !== newTreeCount) {
this.logger.warn(
`[CDC] Adoption tree count changed after processing: ${originalAdoptionId}, old=${existingAdoption.treeCount}, new=${newTreeCount}. This requires special handling.`,
);
// TODO: 实现树数量变化的处理逻辑
} else {
this.logger.debug(`[CDC] Adoption ${orderId} already distributed, skipping update`);
}
return null;
}
const accountSequence = after.account_sequence || after.accountSequence;
const treeCount = after.tree_count || after.treeCount;
const createdAt = after.created_at || after.createdAt || after.paid_at || after.paidAt;
const selectedProvince = after.selected_province || after.selectedProvince || null;
const selectedCity = after.selected_city || after.selectedCity || null;
const newStatus = after.status ?? null;
const oldStatus = before?.status ?? null;
this.logger.log(`[CDC] Adoption update data: account=${accountSequence}, trees=${treeCount}, province=${selectedProvince}, city=${selectedCity}`);
this.logger.log(`[CDC] Adoption update: orderId=${orderId}, status=${oldStatus} -> ${newStatus}, contributionPerTree=${contributionPerTree.toString()}`);
// 在事务中保存同步的认种订单数据
// 查询现有记录
const existingAdoption = await tx.syncedAdoption.findUnique({
where: { originalAdoptionId },
});
// 100%同步数据,使用真实的每棵树贡献值
await tx.syncedAdoption.upsert({
where: { originalAdoptionId },
create: {
@ -193,10 +187,10 @@ export class AdoptionSyncedHandler {
accountSequence,
treeCount,
adoptionDate: new Date(createdAt),
status: after.status ?? null,
status: newStatus,
selectedProvince,
selectedCity,
contributionPerTree: new Decimal('1'),
contributionPerTree,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
@ -204,21 +198,24 @@ export class AdoptionSyncedHandler {
accountSequence,
treeCount,
adoptionDate: new Date(createdAt),
status: after.status ?? undefined,
selectedProvince: selectedProvince ?? undefined,
selectedCity: selectedCity ?? undefined,
contributionPerTree: new Decimal('1'),
status: newStatus,
selectedProvince,
selectedCity,
contributionPerTree,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
});
this.logger.log(`[CDC] Adoption updated successfully: ${originalAdoptionId}`);
this.logger.log(`[CDC] Adoption synced: orderId=${orderId}, status=${newStatus}`);
// 只有当 status 变为 MINING_ENABLED 且尚未计算过算力时,才触发算力计算
const statusChangedToMiningEnabled = newStatus === 'MINING_ENABLED' && oldStatus !== 'MINING_ENABLED';
const needsCalculation = statusChangedToMiningEnabled && !existingAdoption?.contributionDistributed;
// 只有尚未分配算力的认种才需要计算
return {
originalAdoptionId,
needsCalculation: !existingAdoption?.contributionDistributed,
needsCalculation,
};
}

View File

@ -51,14 +51,17 @@ export class CDCEventDispatcher implements OnModuleInit {
this.handleAdoptionPostCommit.bind(this),
);
// 启动 CDC 消费者
try {
await this.cdcConsumer.start();
this.logger.log('CDC event dispatcher started with transactional idempotency');
} catch (error) {
this.logger.error('Failed to start CDC event dispatcher', error);
// 不抛出错误,允许服务在没有 Kafka 的情况下启动(用于本地开发)
}
// 非阻塞启动 CDC 消费者
// 让 HTTP 服务器先启动CDC 同步在后台进行
// 脚本通过 /health/cdc-sync API 轮询同步状态
this.cdcConsumer.start()
.then(() => {
this.logger.log('CDC event dispatcher started with transactional idempotency');
})
.catch((error) => {
this.logger.error('Failed to start CDC event dispatcher', error);
// 不抛出错误,允许服务在没有 Kafka 的情况下启动(用于本地开发)
});
}
private async handleUserEvent(event: CDCEvent, tx: TransactionClient): Promise<void> {

View File

@ -5,22 +5,7 @@ import { CDCEvent, TransactionClient } from '../../infrastructure/kafka/cdc-cons
* CDC
* 1.0 referral-service同步过来的referral_relationships数据
*
* 1.0 (referral_relationships):
* - user_id: BigInt (ID)
* - account_sequence: String ()
* - referrer_id: BigInt (ID, account_sequence)
* - ancestor_path: BigInt[] ( user_id)
* - depth: Int ()
*
* 2.0 :
* - original_user_id (1.0 user_id)
* - referrer_user_id (1.0 referrer_id)
* - referrer account_sequence
* - ancestor_path
*
* handler tx
* 使
*
* 100%
*/
@Injectable()
export class ReferralSyncedHandler {
@ -61,12 +46,11 @@ export class ReferralSyncedHandler {
return;
}
// 1.0 字段映射
const accountSequence = data.account_sequence || data.accountSequence;
const originalUserId = data.user_id || data.userId;
const referrerUserId = data.referrer_id || data.referrerId;
const ancestorPathArray = data.ancestor_path || data.ancestorPath;
const depth = data.depth || 0;
const depth = data.depth ?? 0;
this.logger.log(`[CDC] Referral create: account=${accountSequence}, userId=${originalUserId}, referrerId=${referrerUserId}, depth=${depth}`);
@ -75,11 +59,9 @@ export class ReferralSyncedHandler {
return;
}
// 将 BigInt[] 转换为逗号分隔的字符串
const ancestorPath = this.convertAncestorPath(ancestorPathArray);
this.logger.debug(`[CDC] Referral ancestorPath converted: ${ancestorPath}`);
// 尝试查找推荐人的 account_sequence(使用事务客户端)
// 尝试查找推荐人的 account_sequence
let referrerAccountSequence: string | null = null;
if (referrerUserId) {
const referrer = await tx.syncedReferral.findFirst({
@ -87,14 +69,10 @@ export class ReferralSyncedHandler {
});
if (referrer) {
referrerAccountSequence = referrer.accountSequence;
this.logger.debug(`[CDC] Found referrer account_sequence: ${referrerAccountSequence} for referrer_id: ${referrerUserId}`);
} else {
this.logger.log(`[CDC] Referrer user_id ${referrerUserId} not found yet for ${accountSequence}, will resolve later`);
}
}
// 使用外部事务客户端执行所有操作
this.logger.log(`[CDC] Upserting synced referral: ${accountSequence}`);
// 100%同步数据
await tx.syncedReferral.upsert({
where: { accountSequence },
create: {
@ -108,17 +86,17 @@ export class ReferralSyncedHandler {
syncedAt: new Date(),
},
update: {
referrerAccountSequence: referrerAccountSequence ?? undefined,
referrerUserId: referrerUserId ? BigInt(referrerUserId) : undefined,
originalUserId: originalUserId ? BigInt(originalUserId) : undefined,
ancestorPath: ancestorPath ?? undefined,
depth: depth ?? undefined,
referrerAccountSequence,
referrerUserId: referrerUserId ? BigInt(referrerUserId) : null,
originalUserId: originalUserId ? BigInt(originalUserId) : null,
ancestorPath,
depth,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
});
this.logger.log(`[CDC] Referral synced successfully: ${accountSequence} (user_id: ${originalUserId}) -> referrer_id: ${referrerUserId || 'none'}, depth: ${depth}`);
this.logger.log(`[CDC] Referral synced: ${accountSequence}, referrerId=${referrerUserId || 'none'}, depth=${depth}`);
}
private async handleUpdate(data: any, sequenceNum: bigint, tx: TransactionClient): Promise<void> {
@ -131,7 +109,7 @@ export class ReferralSyncedHandler {
const originalUserId = data.user_id || data.userId;
const referrerUserId = data.referrer_id || data.referrerId;
const ancestorPathArray = data.ancestor_path || data.ancestorPath;
const depth = data.depth || 0;
const depth = data.depth ?? 0;
this.logger.log(`[CDC] Referral update: account=${accountSequence}, referrerId=${referrerUserId}, depth=${depth}`);
@ -142,7 +120,7 @@ export class ReferralSyncedHandler {
const ancestorPath = this.convertAncestorPath(ancestorPathArray);
// Try to resolve the referrer's account_sequence (using the transaction client)
// Try to resolve the referrer's account_sequence
let referrerAccountSequence: string | null = null;
if (referrerUserId) {
const referrer = await tx.syncedReferral.findFirst({
@ -150,10 +128,10 @@ export class ReferralSyncedHandler {
});
if (referrer) {
referrerAccountSequence = referrer.accountSequence;
this.logger.debug(`[CDC] Found referrer account_sequence: ${referrerAccountSequence}`);
}
}
// Mirror 100% of the synced data
await tx.syncedReferral.upsert({
where: { accountSequence },
create: {
@ -167,17 +145,17 @@ export class ReferralSyncedHandler {
syncedAt: new Date(),
},
update: {
referrerAccountSequence: referrerAccountSequence ?? undefined,
referrerUserId: referrerUserId ? BigInt(referrerUserId) : undefined,
originalUserId: originalUserId ? BigInt(originalUserId) : undefined,
ancestorPath: ancestorPath ?? undefined,
depth: depth ?? undefined,
referrerAccountSequence,
referrerUserId: referrerUserId ? BigInt(referrerUserId) : null,
originalUserId: originalUserId ? BigInt(originalUserId) : null,
ancestorPath,
depth,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
});
this.logger.log(`[CDC] Referral updated successfully: ${accountSequence}`);
this.logger.log(`[CDC] Referral synced: ${accountSequence}`);
}
private async handleDelete(data: any): Promise<void> {


@ -6,9 +6,7 @@ import { ContributionAccountAggregate } from '../../domain/aggregates/contributi
* CDC handler for user data synced from the 1.0 identity-service.
*
* Handlers receive the transaction client (tx) and must use it for all writes.
*
* Synced data is mirrored 100% from the source.
*/
@Injectable()
export class UserSyncedHandler {
@ -49,22 +47,19 @@ export class UserSyncedHandler {
return;
}
// Tolerate different field naming: CDC uses snake_case
const userId = data.user_id ?? data.id;
const accountSequence = data.account_sequence ?? data.accountSequence;
const phone = data.phone_number ?? data.phone ?? null;
const status = data.status ?? 'ACTIVE';
const status = data.status ?? null;
this.logger.log(`[CDC] User create: userId=${userId}, accountSequence=${accountSequence}, phone=${phone}, status=${status}`);
this.logger.log(`[CDC] User create: userId=${userId}, accountSequence=${accountSequence}, status=${status}`);
if (!userId || !accountSequence) {
this.logger.warn(`[CDC] Invalid user data: missing user_id or account_sequence`, { data });
return;
}
// Execute all operations with the external transaction client
// Persist the synced user data
this.logger.log(`[CDC] Upserting synced user: ${accountSequence}`);
// Mirror 100% of the synced data
await tx.syncedUser.upsert({
where: { accountSequence },
create: {
@ -76,8 +71,9 @@ export class UserSyncedHandler {
syncedAt: new Date(),
},
update: {
phone: phone ?? undefined,
status: status ?? undefined,
originalUserId: BigInt(userId),
phone,
status,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
@ -95,11 +91,9 @@ export class UserSyncedHandler {
data: persistData,
});
this.logger.log(`[CDC] Created contribution account for user: ${accountSequence}`);
} else {
this.logger.debug(`[CDC] Contribution account already exists for user: ${accountSequence}`);
}
this.logger.log(`[CDC] User synced successfully: ${accountSequence}`);
this.logger.log(`[CDC] User synced: ${accountSequence}`);
}
private async handleUpdate(data: any, sequenceNum: bigint, tx: TransactionClient): Promise<void> {
@ -108,11 +102,10 @@ export class UserSyncedHandler {
return;
}
// Tolerate different field naming: CDC uses snake_case
const userId = data.user_id ?? data.id;
const accountSequence = data.account_sequence ?? data.accountSequence;
const phone = data.phone_number ?? data.phone ?? null;
const status = data.status ?? 'ACTIVE';
const status = data.status ?? null;
this.logger.log(`[CDC] User update: userId=${userId}, accountSequence=${accountSequence}, status=${status}`);
@ -121,6 +114,7 @@ export class UserSyncedHandler {
return;
}
// Mirror 100% of the synced data
await tx.syncedUser.upsert({
where: { accountSequence },
create: {
@ -132,14 +126,15 @@ export class UserSyncedHandler {
syncedAt: new Date(),
},
update: {
phone: phone ?? undefined,
status: status ?? undefined,
originalUserId: BigInt(userId),
phone,
status,
sourceSequenceNum: sequenceNum,
syncedAt: new Date(),
},
});
this.logger.log(`[CDC] User updated successfully: ${accountSequence}`);
this.logger.log(`[CDC] User synced: ${accountSequence}`);
}
private async handleDelete(data: any): Promise<void> {


@ -183,16 +183,16 @@ export class GetContributionAccountQuery {
private toRecordDto(record: any): ContributionRecordDto {
return {
id: record.id,
id: record.id?.toString() ?? '',
sourceType: record.sourceType,
sourceAdoptionId: record.sourceAdoptionId,
sourceAdoptionId: record.sourceAdoptionId?.toString() ?? '',
sourceAccountSequence: record.sourceAccountSequence,
treeCount: record.treeCount,
baseContribution: record.baseContribution.value.toString(),
distributionRate: record.distributionRate.value.toString(),
baseContribution: record.baseContribution?.value?.toString() ?? '0',
distributionRate: record.distributionRate?.value?.toString() ?? '0',
levelDepth: record.levelDepth,
bonusTier: record.bonusTier,
finalContribution: record.finalContribution.value.toString(),
finalContribution: record.amount?.value?.toString() ?? '0',
effectiveDate: record.effectiveDate,
expireDate: record.expireDate,
isExpired: record.isExpired,


@ -0,0 +1,74 @@
import { Injectable } from '@nestjs/common';
import { SyncedDataRepository } from '../../infrastructure/persistence/repositories/synced-data.repository';
export interface PlantingRecordDto {
orderId: string;
orderNo: string;
originalAdoptionId: string;
treeCount: number;
contributionPerTree: string;
totalContribution: string;
status: string;
adoptionDate: string | null;
createdAt: string;
}
export interface PlantingSummaryDto {
totalOrders: number;
totalTreeCount: number;
totalAmount: string;
effectiveTreeCount: number;
firstPlantingAt: string | null;
lastPlantingAt: string | null;
}
export interface PlantingLedgerDto {
summary: PlantingSummaryDto;
items: PlantingRecordDto[];
total: number;
page: number;
pageSize: number;
totalPages: number;
}
@Injectable()
export class GetPlantingLedgerQuery {
constructor(private readonly syncedDataRepository: SyncedDataRepository) {}
async execute(
accountSequence: string,
page: number = 1,
pageSize: number = 20,
): Promise<PlantingLedgerDto> {
const [summary, ledger] = await Promise.all([
this.syncedDataRepository.getPlantingSummary(accountSequence),
this.syncedDataRepository.getPlantingLedger(accountSequence, page, pageSize),
]);
return {
summary: {
totalOrders: summary.totalOrders,
totalTreeCount: summary.totalTreeCount,
totalAmount: summary.totalAmount,
effectiveTreeCount: summary.effectiveTreeCount,
firstPlantingAt: summary.firstPlantingAt?.toISOString() || null,
lastPlantingAt: summary.lastPlantingAt?.toISOString() || null,
},
items: ledger.items.map((item) => ({
orderId: item.id.toString(),
orderNo: `ORD-${item.originalAdoptionId}`,
originalAdoptionId: item.originalAdoptionId.toString(),
treeCount: item.treeCount,
contributionPerTree: item.contributionPerTree.toString(),
totalContribution: item.contributionPerTree.mul(item.treeCount).toString(),
status: item.status || 'UNKNOWN',
adoptionDate: item.adoptionDate?.toISOString() || null,
createdAt: item.createdAt.toISOString(),
})),
total: ledger.total,
page: ledger.page,
pageSize: ledger.pageSize,
totalPages: ledger.totalPages,
};
}
}
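For reference, a response produced by this query might look like the following. This is a minimal illustrative sketch with made-up values, not output taken from the diff; field names follow the DTOs defined above.

// Illustrative example only (values are invented):
const exampleLedger: PlantingLedgerDto = {
  summary: {
    totalOrders: 1,
    totalTreeCount: 3,
    totalAmount: '30',
    effectiveTreeCount: 3,
    firstPlantingAt: '2026-01-10T08:00:00.000Z',
    lastPlantingAt: '2026-01-10T08:00:00.000Z',
  },
  items: [
    {
      orderId: '101',
      orderNo: 'ORD-101',
      originalAdoptionId: '101',
      treeCount: 3,
      contributionPerTree: '10',
      totalContribution: '30', // contributionPerTree * treeCount
      status: 'MINING_ENABLED',
      adoptionDate: '2026-01-10T08:00:00.000Z',
      createdAt: '2026-01-10T08:00:00.000Z',
    },
  ],
  total: 1,
  page: 1,
  pageSize: 20,
  totalPages: 1,
};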


@ -3,9 +3,11 @@ import { Cron, CronExpression } from '@nestjs/schedule';
import { ContributionCalculationService } from '../services/contribution-calculation.service';
import { SnapshotService } from '../services/snapshot.service';
import { ContributionRecordRepository } from '../../infrastructure/persistence/repositories/contribution-record.repository';
import { ContributionAccountRepository } from '../../infrastructure/persistence/repositories/contribution-account.repository';
import { OutboxRepository } from '../../infrastructure/persistence/repositories/outbox.repository';
import { KafkaProducerService } from '../../infrastructure/kafka/kafka-producer.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { ContributionAccountUpdatedEvent } from '../../domain/events';
/**
*
@ -19,6 +21,7 @@ export class ContributionScheduler implements OnModuleInit {
private readonly calculationService: ContributionCalculationService,
private readonly snapshotService: SnapshotService,
private readonly contributionRecordRepository: ContributionRecordRepository,
private readonly contributionAccountRepository: ContributionAccountRepository,
private readonly outboxRepository: OutboxRepository,
private readonly kafkaProducer: KafkaProducerService,
private readonly redis: RedisService,
@ -174,4 +177,128 @@ export class ContributionScheduler implements OnModuleInit {
await this.redis.releaseLock(`${this.LOCK_KEY}:cleanup`, lockValue);
}
}
/**
* Incremental sync: runs every 10 minutes.
* Publishes accounts updated within the past 15 minutes.
*/
@Cron('*/10 * * * *')
async publishRecentlyUpdatedAccounts(): Promise<void> {
const lockValue = await this.redis.acquireLock(`${this.LOCK_KEY}:incremental-sync`, 540); // 9-minute lock
if (!lockValue) {
return;
}
try {
// Find accounts updated within the past 15 minutes (5 minutes of slack over the 10-minute cron to avoid missing boundary cases)
const fifteenMinutesAgo = new Date(Date.now() - 15 * 60 * 1000);
const accounts = await this.contributionAccountRepository.findRecentlyUpdated(fifteenMinutesAgo, 500);
if (accounts.length === 0) {
return;
}
const events = accounts.map((account) => {
const event = new ContributionAccountUpdatedEvent(
account.accountSequence,
account.personalContribution.value.toString(),
account.totalLevelPending.value.toString(),
account.totalBonusPending.value.toString(),
account.effectiveContribution.value.toString(),
account.effectiveContribution.value.toString(),
account.hasAdopted,
account.directReferralAdoptedCount,
account.unlockedLevelDepth,
account.unlockedBonusTiers,
account.createdAt,
);
return {
aggregateType: ContributionAccountUpdatedEvent.AGGREGATE_TYPE,
aggregateId: account.accountSequence,
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
};
});
await this.outboxRepository.saveMany(events);
this.logger.log(`Incremental sync: published ${accounts.length} recently updated accounts`);
} catch (error) {
this.logger.error('Failed to publish recently updated accounts', error);
} finally {
await this.redis.releaseLock(`${this.LOCK_KEY}:incremental-sync`, lockValue);
}
}
/**
* Full sync: runs daily at 04:00.
* Publishes an update event for every contribution account.
*/
@Cron('0 4 * * *')
async publishAllAccountUpdates(): Promise<void> {
const lockValue = await this.redis.acquireLock(`${this.LOCK_KEY}:full-sync`, 3600); // 1-hour lock
if (!lockValue) {
return;
}
try {
this.logger.log('Starting daily full sync of contribution accounts...');
let page = 1;
const pageSize = 100;
let totalPublished = 0;
while (true) {
const { items: accounts, total } = await this.contributionAccountRepository.findMany({
page,
limit: pageSize,
orderBy: 'effectiveContribution',
order: 'desc',
});
if (accounts.length === 0) {
break;
}
const events = accounts.map((account) => {
const event = new ContributionAccountUpdatedEvent(
account.accountSequence,
account.personalContribution.value.toString(),
account.totalLevelPending.value.toString(),
account.totalBonusPending.value.toString(),
account.effectiveContribution.value.toString(),
account.effectiveContribution.value.toString(),
account.hasAdopted,
account.directReferralAdoptedCount,
account.unlockedLevelDepth,
account.unlockedBonusTiers,
account.createdAt,
);
return {
aggregateType: ContributionAccountUpdatedEvent.AGGREGATE_TYPE,
aggregateId: account.accountSequence,
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
};
});
await this.outboxRepository.saveMany(events);
totalPublished += accounts.length;
if (accounts.length < pageSize || page * pageSize >= total) {
break;
}
page++;
}
this.logger.log(`Daily full sync completed: published ${totalPublished} contribution account events`);
} catch (error) {
this.logger.error('Failed to publish all account updates', error);
} finally {
await this.redis.releaseLock(`${this.LOCK_KEY}:full-sync`, lockValue);
}
}
}


@ -0,0 +1,218 @@
import { Injectable, Logger } from '@nestjs/common';
import { UnallocatedContributionRepository, UnallocatedContribution } from '../../infrastructure/persistence/repositories/unallocated-contribution.repository';
import { ContributionAccountRepository } from '../../infrastructure/persistence/repositories/contribution-account.repository';
import { ContributionRecordRepository } from '../../infrastructure/persistence/repositories/contribution-record.repository';
import { OutboxRepository } from '../../infrastructure/persistence/repositories/outbox.repository';
import { UnitOfWork } from '../../infrastructure/persistence/unit-of-work/unit-of-work';
import { ContributionRecordAggregate } from '../../domain/aggregates/contribution-record.aggregate';
import { ContributionSourceType } from '../../domain/aggregates/contribution-account.aggregate';
import { ContributionAmount } from '../../domain/value-objects/contribution-amount.vo';
import { DistributionRate } from '../../domain/value-objects/distribution-rate.vo';
import { ContributionRecordSyncedEvent } from '../../domain/events';
/**
* Bonus back-fill (claim) service.
* When a user unlocks a new bonus tier, pending team-bonus contributions are claimed retroactively.
*/
@Injectable()
export class BonusClaimService {
private readonly logger = new Logger(BonusClaimService.name);
constructor(
private readonly unallocatedContributionRepository: UnallocatedContributionRepository,
private readonly contributionAccountRepository: ContributionAccountRepository,
private readonly contributionRecordRepository: ContributionRecordRepository,
private readonly outboxRepository: OutboxRepository,
private readonly unitOfWork: UnitOfWork,
) {}
/**
* Check whether new bonus tiers were unlocked and claim any pending records.
*
* @param accountSequence referrer's account sequence
* @param previousCount direct-referral adopted count before the update
* @param newCount direct-referral adopted count after the update
*/
async checkAndClaimBonus(
accountSequence: string,
previousCount: number,
newCount: number,
): Promise<void> {
// Check whether new unlock thresholds have been reached
const tiersToClaimList: number[] = [];
// T2: unlocked once >= 2 direct referrals have adopted
if (previousCount < 2 && newCount >= 2) {
tiersToClaimList.push(2);
}
// T3: unlocked once >= 4 direct referrals have adopted
if (previousCount < 4 && newCount >= 4) {
tiersToClaimList.push(3);
}
if (tiersToClaimList.length === 0) {
return;
}
this.logger.log(
`User ${accountSequence} unlocked bonus tiers: ${tiersToClaimList.join(', ')} ` +
`(directReferralAdoptedCount: ${previousCount} -> ${newCount})`,
);
// Check whether we are already inside a transaction (i.e. called by ContributionCalculationService)
// If so, execute directly to avoid nested-transaction timeouts
if (this.unitOfWork.isInTransaction()) {
for (const tier of tiersToClaimList) {
await this.claimBonusTier(accountSequence, tier);
}
} else {
// When called standalone, open a new transaction
await this.unitOfWork.executeInTransaction(async () => {
for (const tier of tiersToClaimList) {
await this.claimBonusTier(accountSequence, tier);
}
});
}
}
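The threshold rule above can be read as a pure function of the count transition. A standalone sketch (the helper name is hypothetical, not part of the service) makes the semantics explicit:

// Hypothetical standalone helper mirroring the tier-unlock rule above.
function tiersUnlockedBy(previousCount: number, newCount: number): number[] {
  const tiers: number[] = [];
  if (previousCount < 2 && newCount >= 2) tiers.push(2); // T2: >= 2 direct adopters
  if (previousCount < 4 && newCount >= 4) tiers.push(3); // T3: >= 4 direct adopters
  return tiers;
}

// tiersUnlockedBy(1, 4) -> [2, 3]; tiersUnlockedBy(2, 3) -> []; tiersUnlockedBy(3, 4) -> [3]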
/**
* Claim all pending bonus records of a single tier for the account.
*/
private async claimBonusTier(accountSequence: string, bonusTier: number): Promise<void> {
// 1. Look up pending records awaiting claim
const pendingRecords = await this.unallocatedContributionRepository.findPendingBonusByAccountSequence(
accountSequence,
bonusTier,
);
if (pendingRecords.length === 0) {
this.logger.debug(`No pending T${bonusTier} bonus records for ${accountSequence}`);
return;
}
this.logger.log(
`Claiming ${pendingRecords.length} T${bonusTier} bonus records for ${accountSequence}`,
);
// 2. Create contribution records
const contributionRecords: ContributionRecordAggregate[] = [];
for (const pending of pendingRecords) {
const record = new ContributionRecordAggregate({
accountSequence: accountSequence,
sourceType: ContributionSourceType.TEAM_BONUS,
sourceAdoptionId: pending.sourceAdoptionId,
sourceAccountSequence: pending.sourceAccountSequence,
treeCount: 0, // back-filled records carry no tree count
baseContribution: new ContributionAmount(0),
distributionRate: DistributionRate.BONUS_PER,
bonusTier: bonusTier,
amount: pending.amount,
effectiveDate: pending.effectiveDate,
expireDate: pending.expireDate,
});
contributionRecords.push(record);
}
// 3. Persist the contribution records
const savedRecords = await this.contributionRecordRepository.saveMany(contributionRecords);
// 4. Update the user's contribution account
let totalAmount = new ContributionAmount(0);
for (const pending of pendingRecords) {
totalAmount = new ContributionAmount(totalAmount.value.plus(pending.amount.value));
}
await this.contributionAccountRepository.updateContribution(
accountSequence,
ContributionSourceType.TEAM_BONUS,
totalAmount,
null,
bonusTier,
);
// 5. Mark the pending records as allocated
const pendingIds = pendingRecords.map((r) => r.id);
await this.unallocatedContributionRepository.claimBonusRecords(pendingIds, accountSequence);
// 6. Publish events to Kafka (via the outbox)
await this.publishBonusClaimEvents(accountSequence, savedRecords, pendingRecords);
this.logger.log(
`Claimed T${bonusTier} bonus for ${accountSequence}: ` +
`${pendingRecords.length} records, total amount: ${totalAmount.value.toString()}`,
);
}
/**
* Publish bonus-claim events via the outbox.
*/
private async publishBonusClaimEvents(
accountSequence: string,
savedRecords: ContributionRecordAggregate[],
pendingRecords: UnallocatedContribution[],
): Promise<void> {
// 1. Publish contribution-record synced events (for mining-admin-service CDC)
for (const record of savedRecords) {
const event = new ContributionRecordSyncedEvent(
record.id!,
record.accountSequence,
record.sourceType,
record.sourceAdoptionId,
record.sourceAccountSequence,
record.treeCount,
record.baseContribution.value.toString(),
record.distributionRate.value.toString(),
record.levelDepth,
record.bonusTier,
record.amount.value.toString(),
record.effectiveDate,
record.expireDate,
record.isExpired,
record.createdAt,
);
await this.outboxRepository.save({
aggregateType: ContributionRecordSyncedEvent.AGGREGATE_TYPE,
aggregateId: record.id!.toString(),
eventType: ContributionRecordSyncedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
}
// 2. Publish the back-fill event to mining-wallet-service
const userContributions = savedRecords.map((record, index) => ({
accountSequence: record.accountSequence,
contributionType: 'TEAM_BONUS',
amount: record.amount.value.toString(),
bonusTier: record.bonusTier,
effectiveDate: record.effectiveDate.toISOString(),
expireDate: record.expireDate.toISOString(),
sourceAdoptionId: record.sourceAdoptionId.toString(),
sourceAccountSequence: record.sourceAccountSequence,
isBackfill: true, // mark as back-fill
}));
const eventId = `bonus-claim-${accountSequence}-${Date.now()}`;
const payload = {
eventType: 'BonusClaimed',
eventId,
timestamp: new Date().toISOString(),
payload: {
accountSequence,
bonusTier: savedRecords[0]?.bonusTier,
claimedCount: savedRecords.length,
userContributions,
},
};
await this.outboxRepository.save({
eventType: 'BonusClaimed',
topic: 'contribution.bonus.claimed',
key: accountSequence,
payload,
aggregateId: accountSequence,
aggregateType: 'ContributionAccount',
});
}
}


@ -12,7 +12,8 @@ import { ContributionRecordAggregate } from '../../domain/aggregates/contributio
import { SyncedReferral } from '../../domain/repositories/synced-data.repository.interface';
import { ContributionDistributionPublisherService } from './contribution-distribution-publisher.service';
import { ContributionRateService } from './contribution-rate.service';
import { ContributionRecordSyncedEvent, NetworkProgressUpdatedEvent } from '../../domain/events';
import { BonusClaimService } from './bonus-claim.service';
import { ContributionRecordSyncedEvent, NetworkProgressUpdatedEvent, ContributionAccountUpdatedEvent } from '../../domain/events';
/**
*
@ -33,6 +34,7 @@ export class ContributionCalculationService {
private readonly unitOfWork: UnitOfWork,
private readonly distributionPublisher: ContributionDistributionPublisherService,
private readonly contributionRateService: ContributionRateService,
private readonly bonusClaimService: BonusClaimService,
) {}
/**
@ -164,6 +166,8 @@ export class ContributionCalculationService {
): Promise<void> {
// Collect all saved records (with IDs) for event publishing
const savedRecords: ContributionRecordAggregate[] = [];
// Collect the account sequences of all updated accounts (for publishing account-updated events)
const updatedAccountSequences = new Set<string>();
// 1. Persist the personal contribution record
const savedPersonalRecord = await this.contributionRecordRepository.save(result.personalRecord);
@ -178,6 +182,7 @@ export class ContributionCalculationService {
}
account.addPersonalContribution(result.personalRecord.amount);
await this.contributionAccountRepository.save(account);
updatedAccountSequences.add(result.personalRecord.accountSequence);
// 2. Persist team-level contribution records
if (result.teamLevelRecords.length > 0) {
@ -193,6 +198,7 @@ export class ContributionCalculationService {
record.levelDepth, // pass the level depth
null,
);
updatedAccountSequences.add(record.accountSequence);
}
}
@ -210,6 +216,7 @@ export class ContributionCalculationService {
null,
record.bonusTier, // pass the bonus tier
);
updatedAccountSequences.add(record.accountSequence);
}
}
@ -250,6 +257,23 @@ export class ContributionCalculationService {
// 6. Publish contribution-record synced events (for mining-admin-service), using the saved records that carry IDs
await this.publishContributionRecordEvents(savedRecords);
// 7. Publish events for all updated accounts (for CDC sync to mining-admin-service)
await this.publishUpdatedAccountEvents(updatedAccountSequences);
}
/**
* Publish account-updated events for every account touched by this calculation.
*/
private async publishUpdatedAccountEvents(accountSequences: Set<string>): Promise<void> {
if (accountSequences.size === 0) return;
for (const accountSequence of accountSequences) {
const account = await this.contributionAccountRepository.findByAccountSequence(accountSequence);
if (account) {
await this.publishContributionAccountUpdatedEvent(account);
}
}
}
/**
@ -300,11 +324,15 @@ export class ContributionCalculationService {
if (!account.hasAdopted) {
account.markAsAdopted();
await this.contributionAccountRepository.save(account);
// Publish an account-updated event to the outbox (for CDC sync to mining-admin-service)
await this.publishContributionAccountUpdatedEvent(account);
}
}
/**
* Update the upline referrer's unlock status
* based on how many direct referrals have adopted.
*/
private async updateReferrerUnlockStatus(referrerAccountSequence: string): Promise<void> {
const account = await this.contributionAccountRepository.findByAccountSequence(referrerAccountSequence);
@ -316,16 +344,27 @@ export class ContributionCalculationService {
);
// Update the unlock status
const currentCount = account.directReferralAdoptedCount;
if (directReferralAdoptedCount > currentCount) {
const previousCount = account.directReferralAdoptedCount;
if (directReferralAdoptedCount > previousCount) {
// Incremental update required
for (let i = currentCount; i < directReferralAdoptedCount; i++) {
for (let i = previousCount; i < directReferralAdoptedCount; i++) {
account.incrementDirectReferralAdoptedCount();
}
await this.contributionAccountRepository.save(account);
// Publish an account-updated event to the outbox (for CDC sync to mining-admin-service)
await this.publishContributionAccountUpdatedEvent(account);
this.logger.debug(
`Updated referrer ${referrerAccountSequence} unlock status: level=${account.unlockedLevelDepth}, bonus=${account.unlockedBonusTiers}`,
);
// Check and back-fill bonuses (T2: >= 2 direct referrals, T3: >= 4 direct referrals)
await this.bonusClaimService.checkAndClaimBonus(
referrerAccountSequence,
previousCount,
directReferralAdoptedCount,
);
}
}
@ -393,4 +432,43 @@ export class ContributionCalculationService {
},
};
}
/**
* Publish a ContributionAccountUpdatedEvent for CDC sync to mining-admin-service.
*/
private async publishContributionAccountUpdatedEvent(
account: ContributionAccountAggregate,
): Promise<void> {
// Total contribution = personal + pending level + pending bonus
const totalContribution = account.personalContribution.value
.plus(account.totalLevelPending.value)
.plus(account.totalBonusPending.value);
const event = new ContributionAccountUpdatedEvent(
account.accountSequence,
account.personalContribution.value.toString(),
account.totalLevelPending.value.toString(),
account.totalBonusPending.value.toString(),
totalContribution.toString(),
account.effectiveContribution.value.toString(),
account.hasAdopted,
account.directReferralAdoptedCount,
account.unlockedLevelDepth,
account.unlockedBonusTiers,
account.createdAt,
);
await this.outboxRepository.save({
aggregateType: ContributionAccountUpdatedEvent.AGGREGATE_TYPE,
aggregateId: account.accountSequence,
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
this.logger.debug(
`Published ContributionAccountUpdatedEvent for ${account.accountSequence}: ` +
`directReferralAdoptedCount=${account.directReferralAdoptedCount}, ` +
`hasAdopted=${account.hasAdopted}`,
);
}
}


@ -0,0 +1,40 @@
/**
* Contribution account updated event.
* Carries directReferralAdoptedCount, unlockedLevelDepth, unlockedBonusTiers, etc.
* Consumed by mining-admin-service via CDC.
*/
export class ContributionAccountUpdatedEvent {
static readonly EVENT_TYPE = 'ContributionAccountUpdated';
static readonly AGGREGATE_TYPE = 'ContributionAccount';
constructor(
public readonly accountSequence: string,
public readonly personalContribution: string,
public readonly teamLevelContribution: string,
public readonly teamBonusContribution: string,
public readonly totalContribution: string,
public readonly effectiveContribution: string,
public readonly hasAdopted: boolean,
public readonly directReferralAdoptedCount: number,
public readonly unlockedLevelDepth: number,
public readonly unlockedBonusTiers: number,
public readonly createdAt: Date,
) {}
toPayload(): Record<string, any> {
return {
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
accountSequence: this.accountSequence,
personalContribution: this.personalContribution,
teamLevelContribution: this.teamLevelContribution,
teamBonusContribution: this.teamBonusContribution,
totalContribution: this.totalContribution,
effectiveContribution: this.effectiveContribution,
hasAdopted: this.hasAdopted,
directReferralAdoptedCount: this.directReferralAdoptedCount,
unlockedLevelDepth: this.unlockedLevelDepth,
unlockedBonusTiers: this.unlockedBonusTiers,
createdAt: this.createdAt.toISOString(),
};
}
}
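For reference, a serialized payload from toPayload() would resemble the following sketch; the values are illustrative, and totalContribution reflects personal plus pending level plus pending bonus as computed by the publisher above.

// Illustrative payload (example values only):
const examplePayload = {
  eventType: 'ContributionAccountUpdated',
  accountSequence: 'ACC-000123',
  personalContribution: '100',
  teamLevelContribution: '25',
  teamBonusContribution: '10',
  totalContribution: '135', // 100 + 25 + 10
  effectiveContribution: '100',
  hasAdopted: true,
  directReferralAdoptedCount: 2,
  unlockedLevelDepth: 3,
  unlockedBonusTiers: 2,
  createdAt: '2026-01-14T12:00:00.000Z',
};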


@ -1,6 +1,7 @@
export * from './contribution-calculated.event';
export * from './daily-snapshot-created.event';
export * from './contribution-account-synced.event';
export * from './contribution-account-updated.event';
export * from './referral-synced.event';
export * from './adoption-synced.event';
export * from './contribution-record-synced.event';


@ -53,6 +53,12 @@ export type TransactionalCDCHandlerWithResult<T> = (event: CDCEvent, tx: Transac
/** Callback invoked after the transaction commits */
export type PostCommitCallback<T> = (result: T) => Promise<void>;
/** Topic consumption phase configuration */
export interface TopicPhase {
topic: string;
tableName: string;
}
@Injectable()
export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
private readonly logger = new Logger(CDCConsumerService.name);
@ -61,6 +67,14 @@ export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
private handlers: Map<string, CDCHandler> = new Map();
private isRunning = false;
// Phased consumption configuration
private topicPhases: TopicPhase[] = [];
private currentPhaseIndex = 0;
private sequentialMode = false;
// Initial-sync-completed flag (true only after all sequential phases have finished)
private initialSyncCompleted = false;
constructor(
private readonly configService: ConfigService,
private readonly prisma: PrismaService,
@ -247,7 +261,14 @@ export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
}
/**
* Start the CDC consumer.
*
* Topics are consumed sequentially, in dependency order:
* 1. users (user_accounts)
* 2. referrals (referral_relationships) - depends on users
* 3. adoptions (planting_orders) - depends on users and referrals
*
* Once every phase has caught up, consumption switches to continuous mode.
*/
async start(): Promise<void> {
if (this.isRunning) {
@ -259,36 +280,213 @@ export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
await this.consumer.connect();
this.logger.log('CDC consumer connected');
// Subscribe to Debezium CDC topics (full sync from the 1.0 services)
const topics = [
// User accounts table (identity-service: user_accounts)
this.configService.get<string>('CDC_TOPIC_USERS', 'cdc.identity.public.user_accounts'),
// Planting orders table (planting-service: planting_orders)
this.configService.get<string>('CDC_TOPIC_ADOPTIONS', 'cdc.planting.public.planting_orders'),
// Referral relationships table (referral-service: referral_relationships)
this.configService.get<string>('CDC_TOPIC_REFERRALS', 'cdc.referral.public.referral_relationships'),
// Configure the sequential consumption phases (order matters!)
this.topicPhases = [
{
topic: this.configService.get<string>('CDC_TOPIC_USERS', 'cdc.identity.public.user_accounts'),
tableName: 'user_accounts',
},
{
topic: this.configService.get<string>('CDC_TOPIC_REFERRALS', 'cdc.referral.public.referral_relationships'),
tableName: 'referral_relationships',
},
{
topic: this.configService.get<string>('CDC_TOPIC_ADOPTIONS', 'cdc.planting.public.planting_orders'),
tableName: 'planting_orders',
},
];
await this.consumer.subscribe({
topics,
fromBeginning: true, // full historical sync on first start
});
this.logger.log(`Subscribed to topics: ${topics.join(', ')}`);
await this.consumer.run({
eachMessage: async (payload: EachMessagePayload) => {
await this.handleMessage(payload);
},
});
this.currentPhaseIndex = 0;
this.sequentialMode = true;
this.isRunning = true;
this.logger.log('CDC consumer started with transactional idempotency protection');
// Start sequential consumption (blocks until done so data dependencies stay in order)
await this.startSequentialConsumption();
this.logger.log('CDC consumer started with sequential phase consumption');
} catch (error) {
this.logger.error('Failed to start CDC consumer', error);
// Do not rethrow; allow the service to start without Kafka (for local development)
}
}
/**
* Consume each configured phase in order until it has caught up.
*/
private async startSequentialConsumption(): Promise<void> {
for (let i = 0; i < this.topicPhases.length; i++) {
this.currentPhaseIndex = i;
const phase = this.topicPhases[i];
this.logger.log(`[CDC] Starting phase ${i + 1}/${this.topicPhases.length}: ${phase.tableName} (${phase.topic})`);
// Consume the current phase until it catches up with the latest offset
await this.consumePhaseToEnd(phase);
this.logger.log(`[CDC] Completed phase ${i + 1}/${this.topicPhases.length}: ${phase.tableName}`);
}
this.logger.log('[CDC] All phases completed. Switching to continuous mode...');
// After all phases complete, switch to continuous mode (listen to all topics at once)
await this.startContinuousMode();
}
/**
* Consume a single topic phase from the beginning until its high watermark is reached.
*/
private async consumePhaseToEnd(phase: TopicPhase): Promise<void> {
const admin = this.kafka.admin();
await admin.connect();
// Fetch the topic's high watermarks and earliest offsets
const topicOffsets = await admin.fetchTopicOffsets(phase.topic);
const highWatermarks: Map<number, string> = new Map();
const earliestOffsets: Map<number, string> = new Map();
for (const partitionOffset of topicOffsets) {
highWatermarks.set(partitionOffset.partition, partitionOffset.high);
earliestOffsets.set(partitionOffset.partition, partitionOffset.low);
}
this.logger.log(`[CDC] Phase ${phase.tableName}: High watermarks = ${JSON.stringify(Object.fromEntries(highWatermarks))}`);
// Check whether the topic is empty
const allEmpty = Array.from(highWatermarks.values()).every(hw => hw === '0');
if (allEmpty) {
this.logger.log(`[CDC] Phase ${phase.tableName}: Topic is empty, skipping`);
await admin.disconnect();
return;
}
// Use a fixed consumer group id
const phaseGroupId = `contribution-service-cdc-phase-${phase.tableName}`;
// Reset the consumer group's offsets to the earliest position
// admin.resetOffsets is used instead of setOffsets: it is simpler and purpose-built for resetting to earliest/latest
// This ensures every service start consumes from the beginning, regardless of previously committed offsets
// Reference: https://kafka.js.org/docs/admin#a-name-reset-offsets-a-resetoffsets
this.logger.log(`[CDC] Phase ${phase.tableName}: Resetting consumer group ${phaseGroupId} offsets to earliest`);
try {
await admin.resetOffsets({
groupId: phaseGroupId,
topic: phase.topic,
earliest: true,
});
this.logger.log(`[CDC] Phase ${phase.tableName}: Consumer group offsets reset successfully`);
} catch (resetError: any) {
// If the consumer group does not exist, resetOffsets fails; that is expected on first run
// fromBeginning: true takes effect in that case
this.logger.log(`[CDC] Phase ${phase.tableName}: Could not reset offsets (may be first run): ${resetError.message}`);
}
const phaseConsumer = this.kafka.consumer({
groupId: phaseGroupId,
});
try {
await phaseConsumer.connect();
// Subscribe to the single topic; fromBeginning applies to the new group
await phaseConsumer.subscribe({
topic: phase.topic,
fromBeginning: true,
});
let processedOffsets: Map<number, bigint> = new Map();
let isComplete = false;
for (const partition of highWatermarks.keys()) {
processedOffsets.set(partition, BigInt(-1));
}
// Start consuming
await phaseConsumer.run({
eachMessage: async (payload: EachMessagePayload) => {
await this.handleMessage(payload);
// Record the processed offset
processedOffsets.set(payload.partition, BigInt(payload.message.offset));
// Check whether every partition has caught up with its high watermark
let allCaughtUp = true;
for (const [partition, highWatermark] of highWatermarks) {
const processed = processedOffsets.get(partition) ?? BigInt(-1);
// The high watermark is the next offset to be written, so the processed offset must be >= highWatermark - 1
if (processed < BigInt(highWatermark) - BigInt(1)) {
allCaughtUp = false;
break;
}
}
if (allCaughtUp && !isComplete) {
isComplete = true;
this.logger.log(`[CDC] Phase ${phase.tableName}: Caught up with all partitions`);
}
},
});
// Wait until the high watermark is reached
while (!isComplete) {
await new Promise(resolve => setTimeout(resolve, 100));
// Log sync progress on each poll
const currentProgress = Array.from(processedOffsets.entries())
.map(([p, o]) => `P${p}:${o}/${highWatermarks.get(p)}`)
.join(', ');
this.logger.debug(`[CDC] Phase ${phase.tableName} progress: ${currentProgress}`);
}
// Stop consuming
await phaseConsumer.stop();
await phaseConsumer.disconnect();
await admin.disconnect();
} catch (error) {
this.logger.error(`[CDC] Error in phase ${phase.tableName}`, error);
await phaseConsumer.disconnect();
await admin.disconnect();
throw error;
}
}
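As a worked example of the catch-up check above: a partition whose high watermark is '5' holds offsets 0..4, so it counts as caught up once the last processed offset is >= 4 (highWatermark - 1); and when every partition's high watermark is '0' the topic is empty and the phase is skipped outright.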
/**
* Continuous mode: consume all topics in parallel.
*/
private async startContinuousMode(): Promise<void> {
this.sequentialMode = false;
this.initialSyncCompleted = true; // mark the initial sync as completed
const topics = this.topicPhases.map(p => p.topic);
await this.consumer.subscribe({
topics,
fromBeginning: false, // resume from the last committed position (not from the beginning)
});
this.logger.log(`[CDC] Continuous mode: Subscribed to topics: ${topics.join(', ')}`);
await this.consumer.run({
eachMessage: async (payload: EachMessagePayload) => {
await this.handleMessage(payload);
},
});
this.logger.log('[CDC] Continuous mode started - all topics being consumed in parallel');
}
/**
* Report the CDC sync status.
* allPhasesCompleted is true only after the sequential initial sync has finished.
*/
getSyncStatus(): { isRunning: boolean; sequentialMode: boolean; allPhasesCompleted: boolean } {
return {
isRunning: this.isRunning,
sequentialMode: this.sequentialMode,
allPhasesCompleted: this.initialSyncCompleted,
};
}
/**
*
*/
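The /health/cdc-sync endpoint that the reset script polls is not shown in this hunk; a minimal sketch of such a controller, assuming it simply exposes getSyncStatus() (import path, class name, and route prefix are assumptions), could look like:

import { Controller, Get } from '@nestjs/common';
import { CDCConsumerService } from '../infrastructure/kafka/cdc-consumer.service';

@Controller('health')
export class CdcSyncHealthController {
  constructor(private readonly cdcConsumer: CDCConsumerService) {}

  @Get('cdc-sync')
  getCdcSync() {
    // Returns { isRunning, sequentialMode, allPhasesCompleted }
    return this.cdcConsumer.getSyncStatus();
  }
}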


@ -223,6 +223,16 @@ export class ContributionAccountRepository implements IContributionAccountReposi
});
}
async findRecentlyUpdated(since: Date, limit: number = 500): Promise<ContributionAccountAggregate[]> {
const records = await this.client.contributionAccount.findMany({
where: { updatedAt: { gte: since } },
orderBy: { updatedAt: 'desc' },
take: limit,
});
return records.map((r) => this.toDomain(r));
}
private toDomain(record: any): ContributionAccountAggregate {
return ContributionAccountAggregate.fromPersistence({
id: record.id,


@ -136,7 +136,10 @@ export class SyncedDataRepository implements ISyncedDataRepository {
async findUndistributedAdoptions(limit: number = 100): Promise<SyncedAdoption[]> {
const records = await this.client.syncedAdoption.findMany({
where: { contributionDistributed: false },
where: {
contributionDistributed: false,
status: 'MINING_ENABLED', // only process adoption orders that reached final success
},
orderBy: { adoptionDate: 'asc' },
take: limit,
});
@ -171,7 +174,10 @@ export class SyncedDataRepository implements ISyncedDataRepository {
async getTotalTreesByAccountSequence(accountSequence: string): Promise<number> {
const result = await this.client.syncedAdoption.aggregate({
where: { accountSequence },
where: {
accountSequence,
status: 'MINING_ENABLED', // only count adoption orders that reached final success
},
_sum: { treeCount: true },
});
return result._sum.treeCount ?? 0;
@ -285,8 +291,12 @@ export class SyncedDataRepository implements ISyncedDataRepository {
const accountSequences = directReferrals.map((r) => r.accountSequence);
// Only count direct referrals that have at least one MINING_ENABLED adoption record
const adoptedCount = await this.client.syncedAdoption.findMany({
where: { accountSequence: { in: accountSequences } },
where: {
accountSequence: { in: accountSequences },
status: 'MINING_ENABLED', // only count adoption orders that reached final success
},
distinct: ['accountSequence'],
});
@ -308,7 +318,10 @@ export class SyncedDataRepository implements ISyncedDataRepository {
const adoptions = await this.client.syncedAdoption.groupBy({
by: ['accountSequence'],
where: { accountSequence: { in: sequences } },
where: {
accountSequence: { in: sequences },
status: 'MINING_ENABLED', // only count adoption orders that reached final success
},
_sum: { treeCount: true },
});
@ -346,6 +359,89 @@ export class SyncedDataRepository implements ISyncedDataRepository {
return result;
}
// ========== Planting ledger queries ==========
async getPlantingLedger(
accountSequence: string,
page: number = 1,
pageSize: number = 20,
): Promise<{
items: SyncedAdoption[];
total: number;
page: number;
pageSize: number;
totalPages: number;
}> {
const skip = (page - 1) * pageSize;
// Only return adoption records with MINING_ENABLED status
const whereClause = { accountSequence, status: 'MINING_ENABLED' };
const [items, total] = await Promise.all([
this.client.syncedAdoption.findMany({
where: whereClause,
orderBy: { adoptionDate: 'desc' },
skip,
take: pageSize,
}),
this.client.syncedAdoption.count({
where: whereClause,
}),
]);
return {
items: items.map((r) => this.toSyncedAdoption(r)),
total,
page,
pageSize,
totalPages: Math.ceil(total / pageSize),
};
}
async getPlantingSummary(accountSequence: string): Promise<{
totalOrders: number;
totalTreeCount: number;
totalAmount: string;
effectiveTreeCount: number;
firstPlantingAt: Date | null;
lastPlantingAt: Date | null;
}> {
// Only aggregate adoption records with MINING_ENABLED status
const adoptions = await this.client.syncedAdoption.findMany({
where: { accountSequence, status: 'MINING_ENABLED' },
orderBy: { adoptionDate: 'asc' },
});
if (adoptions.length === 0) {
return {
totalOrders: 0,
totalTreeCount: 0,
totalAmount: '0',
effectiveTreeCount: 0,
firstPlantingAt: null,
lastPlantingAt: null,
};
}
const totalOrders = adoptions.length;
const totalTreeCount = adoptions.reduce((sum, a) => sum + a.treeCount, 0);
// Compute the total amount: treeCount * contributionPerTree
let totalAmount = new Decimal(0);
for (const adoption of adoptions) {
const amount = new Decimal(adoption.contributionPerTree).mul(adoption.treeCount);
totalAmount = totalAmount.add(amount);
}
return {
totalOrders,
totalTreeCount,
totalAmount: totalAmount.toString(),
effectiveTreeCount: totalTreeCount, // every counted record is MINING_ENABLED, so all trees are effective
firstPlantingAt: adoptions[0]?.adoptionDate || null,
lastPlantingAt: adoptions[adoptions.length - 1]?.adoptionDate || null,
};
}
// ========== Statistics methods (used by query services) ==========
async countUsers(): Promise<number> {
@ -358,7 +454,10 @@ export class SyncedDataRepository implements ISyncedDataRepository {
async countUndistributedAdoptions(): Promise<number> {
return this.client.syncedAdoption.count({
where: { contributionDistributed: false },
where: {
contributionDistributed: false,
status: 'MINING_ENABLED', // only count adoption orders that reached final success
},
});
}


@ -7,14 +7,16 @@ export interface UnallocatedContribution {
unallocType: string;
wouldBeAccountSequence: string | null;
levelDepth: number | null;
bonusTier: number | null;
amount: ContributionAmount;
reason: string | null;
sourceAdoptionId: bigint;
sourceAccountSequence: string;
effectiveDate: Date;
expireDate: Date;
allocatedToHeadquarters: boolean;
status: string;
allocatedAt: Date | null;
allocatedToAccountSequence: string | null;
createdAt: Date;
}
@ -130,20 +132,82 @@ export class UnallocatedContributionRepository {
};
}
/**
* Find pending bonus records for an account.
* @param accountSequence account sequence of the would-be recipient
* @param bonusTier bonus tier (2 or 3)
*/
async findPendingBonusByAccountSequence(
accountSequence: string,
bonusTier: number,
): Promise<UnallocatedContribution[]> {
const records = await this.client.unallocatedContribution.findMany({
where: {
wouldBeAccountSequence: accountSequence,
unallocType: `BONUS_TIER_${bonusTier}`,
status: 'PENDING',
},
orderBy: { createdAt: 'asc' },
});
return records.map((r) => this.toDomain(r));
}
/**
* Mark pending bonus records as claimed (allocated to the user).
* @param ids list of record IDs
* @param accountSequence account sequence of the claiming user
*/
async claimBonusRecords(ids: bigint[], accountSequence: string): Promise<void> {
if (ids.length === 0) return;
await this.client.unallocatedContribution.updateMany({
where: {
id: { in: ids },
status: 'PENDING',
},
data: {
status: 'ALLOCATED_TO_USER',
allocatedAt: new Date(),
allocatedToAccountSequence: accountSequence,
},
});
}
/**
* Find all pending bonus records (any tier) for an account.
*/
async findAllPendingBonusByAccountSequence(
accountSequence: string,
): Promise<UnallocatedContribution[]> {
const records = await this.client.unallocatedContribution.findMany({
where: {
wouldBeAccountSequence: accountSequence,
unallocType: { startsWith: 'BONUS_TIER_' },
status: 'PENDING',
},
orderBy: { createdAt: 'asc' },
});
return records.map((r) => this.toDomain(r));
}
private toDomain(record: any): UnallocatedContribution {
return {
id: record.id,
unallocType: record.unallocType,
wouldBeAccountSequence: record.wouldBeAccountSequence,
levelDepth: record.levelDepth,
bonusTier: record.bonusTier,
amount: new ContributionAmount(record.amount),
reason: record.reason,
sourceAdoptionId: record.sourceAdoptionId,
sourceAccountSequence: record.sourceAccountSequence,
effectiveDate: record.effectiveDate,
expireDate: record.expireDate,
allocatedToHeadquarters: record.allocatedToHeadquarters,
status: record.status,
allocatedAt: record.allocatedAt,
allocatedToAccountSequence: record.allocatedToAccountSequence,
createdAt: record.createdAt,
};
}


@ -1102,9 +1102,47 @@ full_reset() {
service_start "$service"
done
log_step "Step 10/18: Waiting for services to be ready and sync from 1.0..."
log_info "Waiting 30 seconds for all services to start and sync data from 1.0 CDC..."
sleep 30
log_step "Step 10/18: Waiting for contribution-service CDC sync to complete..."
log_info "Waiting for contribution-service to complete CDC sync (user_accounts -> referral_relationships -> planting_orders)..."
# Wait for contribution-service to finish its sequential CDC sync
# Poll the /health/cdc-sync API for the sync status
local max_wait=600 # wait at most 10 minutes
local wait_count=0
local sync_completed=false
local cdc_sync_url="http://localhost:3020/api/v2/health/cdc-sync"
while [ "$wait_count" -lt "$max_wait" ] && [ "$sync_completed" = false ]; do
# Query the API for the sync status
local sync_status
sync_status=$(curl -s "$cdc_sync_url" 2>/dev/null || echo '{}')
if echo "$sync_status" | grep -q '"allPhasesCompleted":true'; then
sync_completed=true
log_success "CDC sync completed - all phases finished"
else
# Show the current status
local is_running
local sequential_mode
is_running=$(echo "$sync_status" | grep -o '"isRunning":[^,}]*' | cut -d':' -f2)
sequential_mode=$(echo "$sync_status" | grep -o '"sequentialMode":[^,}]*' | cut -d':' -f2)
if [ "$is_running" = "true" ] && [ "$sequential_mode" = "true" ]; then
log_info "CDC sync in progress (sequential mode)... (waited ${wait_count}s)"
elif [ "$is_running" = "true" ]; then
log_info "CDC consumer running... (waited ${wait_count}s)"
else
log_info "Waiting for CDC consumer to start... (waited ${wait_count}s)"
fi
sleep 5
wait_count=$((wait_count + 5))
fi
done
if [ "$sync_completed" = false ]; then
log_warn "CDC sync did not complete within ${max_wait}s, proceeding anyway..."
log_info "You may need to wait longer or check: curl $cdc_sync_url"
fi
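Note that the grep above matches the literal substring "allPhasesCompleted":true, so it works whether the endpoint returns the getSyncStatus() object directly, e.g. {"isRunning":true,"sequentialMode":false,"allPhasesCompleted":true}, or wraps it in a response envelope, as long as the flag is serialized verbatim.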
log_step "Step 11/18: Registering Debezium outbox connectors..."
# Register outbox connectors AFTER services are running and have synced data


@ -3,11 +3,11 @@ import { ApplicationModule } from '../application/application.module';
import { AuthController } from './controllers/auth.controller';
import { DashboardController } from './controllers/dashboard.controller';
import { ConfigController } from './controllers/config.controller';
import { InitializationController } from './controllers/initialization.controller';
import { AuditController } from './controllers/audit.controller';
import { HealthController } from './controllers/health.controller';
import { UsersController } from './controllers/users.controller';
import { SystemAccountsController } from './controllers/system-accounts.controller';
import { ReportsController } from './controllers/reports.controller';
@Module({
imports: [ApplicationModule],
@ -15,11 +15,11 @@ import { SystemAccountsController } from './controllers/system-accounts.controll
AuthController,
DashboardController,
ConfigController,
InitializationController,
AuditController,
HealthController,
UsersController,
SystemAccountsController,
ReportsController,
],
})
export class ApiModule {}


@ -4,7 +4,7 @@ import { DashboardService } from '../../application/services/dashboard.service';
@ApiTags('Audit')
@ApiBearerAuth()
@Controller('audit-logs')
@Controller('audit')
export class AuditController {
constructor(private readonly dashboardService: DashboardService) {}
@ -13,15 +13,42 @@ export class AuditController {
@ApiQuery({ name: 'adminId', required: false })
@ApiQuery({ name: 'action', required: false })
@ApiQuery({ name: 'resource', required: false })
@ApiQuery({ name: 'keyword', required: false })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getAuditLogs(
@Query('adminId') adminId?: string,
@Query('action') action?: string,
@Query('resource') resource?: string,
@Query('keyword') keyword?: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.dashboardService.getAuditLogs({ adminId, action, resource, page: page ?? 1, pageSize: pageSize ?? 50 });
const result = await this.dashboardService.getAuditLogs({
adminId,
action,
resource,
page: page ?? 1,
pageSize: pageSize ?? 20,
});
// Transform to the shape the frontend expects
return {
items: result.data.map((log: any) => ({
id: log.id,
adminId: log.adminId,
adminUsername: log.admin?.username || 'unknown',
action: log.action,
resource: log.resource,
resourceId: log.resourceId,
details: log.newValue ? JSON.stringify(log.newValue) : null,
ipAddress: log.ipAddress || '-',
createdAt: log.createdAt,
})),
total: result.total,
page: result.pagination.page,
pageSize: result.pagination.pageSize,
totalPages: result.pagination.totalPages,
};
}
}


@ -1,5 +1,6 @@
import { Controller, Get, Post, Delete, Body, Param, Query, Req } from '@nestjs/common';
import { Controller, Get, Post, Delete, Body, Param, Query, Req, Logger } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiBearerAuth, ApiQuery, ApiParam } from '@nestjs/swagger';
import { ConfigService } from '@nestjs/config';
import { ConfigManagementService } from '../../application/services/config.service';
class SetConfigDto { category: string; key: string; value: string; description?: string; }
@ -8,7 +9,12 @@ class SetConfigDto { category: string; key: string; value: string; description?:
@ApiBearerAuth()
@Controller('configs')
export class ConfigController {
constructor(private readonly configService: ConfigManagementService) {}
private readonly logger = new Logger(ConfigController.name);
constructor(
private readonly configService: ConfigManagementService,
private readonly appConfigService: ConfigService,
) {}
@Get()
@ApiOperation({ summary: '获取配置列表' })
@ -17,6 +23,90 @@ export class ConfigController {
return this.configService.getConfigs(category);
}
@Get('transfer-enabled')
@ApiOperation({ summary: '获取划转开关状态' })
async getTransferEnabled() {
const config = await this.configService.getConfig('system', 'transfer_enabled');
return { enabled: config?.configValue === 'true' };
}
@Post('transfer-enabled')
@ApiOperation({ summary: '设置划转开关状态' })
async setTransferEnabled(@Body() body: { enabled: boolean }, @Req() req: any) {
await this.configService.setConfig(req.admin.id, 'system', 'transfer_enabled', String(body.enabled), '划转开关');
return { success: true };
}
@Get('mining/status')
@ApiOperation({ summary: '获取挖矿状态' })
async getMiningStatus() {
const miningServiceUrl = this.appConfigService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
this.logger.log(`Fetching mining status from ${miningServiceUrl}/api/v2/admin/status`);
try {
const response = await fetch(`${miningServiceUrl}/api/v2/admin/status`);
if (!response.ok) {
throw new Error(`Failed to fetch mining status: ${response.status}`);
}
const result = await response.json();
this.logger.log(`Mining service response: ${JSON.stringify(result)}`);
if (result.data) {
return result.data;
}
return {
initialized: false,
isActive: false,
error: 'Invalid response from mining service',
};
} catch (error) {
this.logger.error('Failed to get mining status', error);
return {
initialized: false,
isActive: false,
error: `Unable to connect to mining service: ${error.message}`,
};
}
}
@Post('mining/activate')
@ApiOperation({ summary: '激活挖矿' })
async activateMining(@Req() req: any) {
const miningServiceUrl = this.appConfigService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
try {
const response = await fetch(`${miningServiceUrl}/api/v2/admin/activate`, {
method: 'POST',
});
if (!response.ok) {
throw new Error('Failed to activate mining');
}
const result = await response.json();
this.logger.log(`Mining activated by admin ${req.admin?.id}`);
return result;
} catch (error) {
this.logger.error('Failed to activate mining', error);
return { success: false, message: 'Failed to activate mining' };
}
}
@Post('mining/deactivate')
@ApiOperation({ summary: '停用挖矿' })
async deactivateMining(@Req() req: any) {
const miningServiceUrl = this.appConfigService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
try {
const response = await fetch(`${miningServiceUrl}/api/v2/admin/deactivate`, {
method: 'POST',
});
if (!response.ok) {
throw new Error('Failed to deactivate mining');
}
const result = await response.json();
this.logger.log(`Mining deactivated by admin ${req.admin?.id}`);
return result;
} catch (error) {
this.logger.error('Failed to deactivate mining', error);
return { success: false, message: 'Failed to deactivate mining' };
}
}
@Get(':category/:key')
@ApiOperation({ summary: '获取单个配置' })
@ApiParam({ name: 'category' })


@ -16,19 +16,54 @@ export class DashboardController {
@Get()
@ApiOperation({ summary: '获取仪表盘统计数据' })
async getStats() {
return this.dashboardService.getDashboardStats();
const raw = await this.dashboardService.getDashboardStats();
// Compute the 24-hour price change
let priceChange24h = 0;
if (raw.latestPrice) {
const open = parseFloat(raw.latestPrice.open) || 1;
const close = parseFloat(raw.latestPrice.close) || 1;
priceChange24h = (close - open) / open;
}
// Transform to the shape the frontend expects
return {
totalUsers: raw.users?.total || 0,
adoptedUsers: raw.users?.adopted || 0,
totalTrees: raw.contribution?.totalTrees || 0,
networkEffectiveContribution: raw.contribution?.effectiveContribution || '0',
networkTotalContribution: raw.contribution?.totalContribution || '0',
networkLevelPending: raw.contribution?.teamLevelContribution || '0',
networkBonusPending: raw.contribution?.teamBonusContribution || '0',
totalDistributed: raw.mining?.totalMined || '0',
totalBurned: raw.mining?.latestDailyStat?.totalBurned || '0',
circulationPool: raw.trading?.circulationPool?.totalShares || '0',
currentPrice: raw.latestPrice?.close || '1',
priceChange24h,
totalOrders: raw.trading?.totalAccounts || 0,
totalTrades: raw.trading?.totalAccounts || 0,
};
}
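As a worked example of the 24-hour change above: open 1.00 and close 1.05 give priceChange24h = (1.05 - 1.00) / 1.00 = 0.05, i.e. a 5% move; the || 1 fallbacks keep the division defined when price fields are missing.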
@Get('stats')
@ApiOperation({ summary: '获取仪表盘统计数据(别名)' })
async getStatsAlias() {
return this.dashboardService.getDashboardStats();
return this.getStats();
}
@Get('realtime')
@ApiOperation({ summary: '获取实时数据' })
async getRealtimeStats() {
return this.dashboardService.getRealtimeStats();
const raw = await this.dashboardService.getRealtimeStats();
// Transform to the shape the frontend expects
return {
currentMinuteDistribution: raw.minuteDistribution || '0',
currentMinuteBurn: '0', // no realtime burn data yet
activeOrders: 0, // no realtime order data yet
pendingTrades: 0, // no pending trade data yet
lastPriceUpdateAt: raw.timestamp,
};
}
@Get('reports')


@ -1,77 +0,0 @@
import { Controller, Post, Body, Req } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiBearerAuth } from '@nestjs/swagger';
import { InitializationService } from '../../application/services/initialization.service';
class InitMiningConfigDto {
totalShares: string;
distributionPool: string;
halvingPeriodYears: number;
burnTarget: string;
}
@ApiTags('Initialization')
@ApiBearerAuth()
@Controller('initialization')
export class InitializationController {
constructor(private readonly initService: InitializationService) {}
@Post('mining-config')
@ApiOperation({ summary: '初始化挖矿配置' })
async initMiningConfig(@Body() dto: InitMiningConfigDto, @Req() req: any) {
return this.initService.initializeMiningConfig(req.admin.id, dto);
}
@Post('system-accounts')
@ApiOperation({ summary: '初始化系统账户' })
async initSystemAccounts(@Req() req: any) {
return this.initService.initializeSystemAccounts(req.admin.id);
}
@Post('activate-mining')
@ApiOperation({ summary: '激活挖矿' })
async activateMining(@Req() req: any) {
return this.initService.activateMining(req.admin.id);
}
@Post('sync-users')
@ApiOperation({ summary: '同步所有用户数据从auth-service初始同步' })
async syncUsers(@Req() req: any) {
return this.initService.syncAllUsers(req.admin.id);
}
@Post('sync-contribution-accounts')
@ApiOperation({ summary: '同步所有算力账户从contribution-service初始同步' })
async syncContributionAccounts(@Req() req: any) {
return this.initService.syncAllContributionAccounts(req.admin.id);
}
@Post('sync-mining-accounts')
@ApiOperation({ summary: '同步所有挖矿账户从mining-service初始同步' })
async syncMiningAccounts(@Req() req: any) {
return this.initService.syncAllMiningAccounts(req.admin.id);
}
@Post('sync-trading-accounts')
@ApiOperation({ summary: '同步所有交易账户从trading-service初始同步' })
async syncTradingAccounts(@Req() req: any) {
return this.initService.syncAllTradingAccounts(req.admin.id);
}
@Post('sync-all')
@ApiOperation({ summary: '执行完整的数据同步(用户+算力+挖矿+交易)' })
async syncAll(@Req() req: any) {
const adminId = req.admin.id;
const results = {
users: await this.initService.syncAllUsers(adminId),
contribution: await this.initService.syncAllContributionAccounts(adminId),
mining: await this.initService.syncAllMiningAccounts(adminId),
trading: await this.initService.syncAllTradingAccounts(adminId),
};
return {
success: true,
message: '全部同步完成',
details: results,
};
}
}


@ -0,0 +1,59 @@
import { Controller, Get, Query } from '@nestjs/common';
import {
ApiTags,
ApiOperation,
ApiBearerAuth,
ApiQuery,
} from '@nestjs/swagger';
import { DashboardService } from '../../application/services/dashboard.service';
@ApiTags('Reports')
@ApiBearerAuth()
@Controller('reports')
export class ReportsController {
constructor(private readonly dashboardService: DashboardService) {}
@Get('daily')
@ApiOperation({ summary: '获取每日报表' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
@ApiQuery({ name: 'days', required: false, type: Number })
async getDailyReports(
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
@Query('days') days?: number,
) {
const result = await this.dashboardService.getReports(
page ?? 1,
pageSize ?? 30,
);
// Transform to the shape the frontend expects
return {
items: result.data.map((report: any) => ({
id: report.id,
reportDate: report.reportDate,
totalUsers: report.users?.total || 0,
newUsers: report.users?.new || 0,
adoptedUsers: report.adoptions?.total || 0,
newAdoptedUsers: report.adoptions?.new || 0,
totalContribution: report.contribution?.total || '0',
newContribution: report.contribution?.growth || '0',
totalDistributed: report.mining?.distributed || '0',
dailyDistributed: report.mining?.distributed || '0',
totalBurned: report.mining?.burned || '0',
dailyBurned: report.mining?.burned || '0',
openPrice: report.price?.open || '1',
closePrice: report.price?.close || '1',
highPrice: report.price?.high || '1',
lowPrice: report.price?.low || '1',
totalVolume: report.trading?.volume || '0',
dailyVolume: report.trading?.volume || '0',
})),
total: result.total,
page: result.pagination.page,
pageSize: result.pagination.pageSize,
totalPages: result.pagination.totalPages,
};
}
}


@ -2,28 +2,28 @@ import { Module, OnModuleInit } from '@nestjs/common';
import { InfrastructureModule } from '../infrastructure/infrastructure.module';
import { AuthService } from './services/auth.service';
import { ConfigManagementService } from './services/config.service';
import { InitializationService } from './services/initialization.service';
import { DashboardService } from './services/dashboard.service';
import { UsersService } from './services/users.service';
import { SystemAccountsService } from './services/system-accounts.service';
import { DailyReportService } from './services/daily-report.service';
@Module({
imports: [InfrastructureModule],
providers: [
AuthService,
ConfigManagementService,
InitializationService,
DashboardService,
UsersService,
SystemAccountsService,
DailyReportService,
],
exports: [
AuthService,
ConfigManagementService,
InitializationService,
DashboardService,
UsersService,
SystemAccountsService,
DailyReportService,
],
})
export class ApplicationModule implements OnModuleInit {


@ -0,0 +1,264 @@
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import Decimal from 'decimal.js';
@Injectable()
export class DailyReportService implements OnModuleInit {
private readonly logger = new Logger(DailyReportService.name);
private reportInterval: NodeJS.Timeout | null = null;
constructor(private readonly prisma: PrismaService) {}
async onModuleInit() {
// Generate the report once at startup
await this.generateTodayReport();
// Re-check and update today's report every hour
this.reportInterval = setInterval(
() => this.generateTodayReport(),
60 * 60 * 1000, // 1 hour
);
this.logger.log('Daily report service initialized');
}
/**
* Generate (or update) today's report
*/
async generateTodayReport(): Promise<void> {
const today = new Date();
today.setHours(0, 0, 0, 0);
try {
this.logger.log(`Generating daily report for ${today.toISOString().split('T')[0]}`);
// Collect the individual statistics
const [
userStats,
adoptionStats,
contributionStats,
miningStats,
tradingStats,
priceStats,
] = await Promise.all([
this.getUserStats(today),
this.getAdoptionStats(today),
this.getContributionStats(today),
this.getMiningStats(),
this.getTradingStats(today),
this.getPriceStats(today),
]);
// Update or create today's report
await this.prisma.dailyReport.upsert({
where: { reportDate: today },
create: {
reportDate: today,
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
update: {
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
});
this.logger.log(`Daily report generated successfully for ${today.toISOString().split('T')[0]}`);
} catch (error) {
this.logger.error('Failed to generate daily report', error);
}
}
/**
* Generate a report for a historical date
*/
async generateHistoricalReport(date: Date): Promise<void> {
const reportDate = new Date(date);
reportDate.setHours(0, 0, 0, 0);
const [
userStats,
adoptionStats,
contributionStats,
miningStats,
tradingStats,
priceStats,
] = await Promise.all([
this.getUserStats(reportDate),
this.getAdoptionStats(reportDate),
this.getContributionStats(reportDate),
this.getMiningStats(),
this.getTradingStats(reportDate),
this.getPriceStats(reportDate),
]);
await this.prisma.dailyReport.upsert({
where: { reportDate },
create: {
reportDate,
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
update: {
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
});
}
/**
* User statistics
*/
private async getUserStats(date: Date) {
const nextDay = new Date(date);
nextDay.setDate(nextDay.getDate() + 1);
const [totalUsers, newUsers] = await Promise.all([
this.prisma.syncedUser.count({
where: { createdAt: { lt: nextDay } },
}),
this.prisma.syncedUser.count({
where: {
createdAt: { gte: date, lt: nextDay },
},
}),
]);
// Active users temporarily equal total users (accurate counting needs activity tracking)
const activeUsers = totalUsers;
return {
totalUsers,
newUsers,
activeUsers,
};
}
/**
* Adoption statistics
*/
private async getAdoptionStats(date: Date) {
const nextDay = new Date(date);
nextDay.setDate(nextDay.getDate() + 1);
const [totalAdoptions, newAdoptions, treesResult] = await Promise.all([
this.prisma.syncedAdoption.count({
where: { adoptionDate: { lt: nextDay } },
}),
this.prisma.syncedAdoption.count({
where: {
adoptionDate: { gte: date, lt: nextDay },
},
}),
this.prisma.syncedAdoption.aggregate({
where: { adoptionDate: { lt: nextDay } },
_sum: { treeCount: true },
}),
]);
return {
totalAdoptions,
newAdoptions,
totalTrees: treesResult._sum.treeCount || 0,
};
}
/**
* Contribution statistics
*/
private async getContributionStats(date: Date) {
// Get network-wide contribution progress
const networkProgress = await this.prisma.syncedNetworkProgress.findFirst();
// Aggregate user contribution
const userContribution = await this.prisma.syncedContributionAccount.aggregate({
_sum: {
totalContribution: true,
effectiveContribution: true,
},
});
const totalContribution = new Decimal(
userContribution._sum.totalContribution?.toString() || '0',
);
// Fetch yesterday's report to compute growth
const yesterday = new Date(date);
yesterday.setDate(yesterday.getDate() - 1);
const yesterdayReport = await this.prisma.dailyReport.findUnique({
where: { reportDate: yesterday },
});
const contributionGrowth = yesterdayReport
? totalContribution.minus(new Decimal(yesterdayReport.totalContribution.toString()))
: totalContribution;
return {
totalContribution,
contributionGrowth: contributionGrowth.gt(0) ? contributionGrowth : new Decimal(0),
};
}
/**
* Mining statistics
*/
private async getMiningStats() {
const dailyStat = await this.prisma.syncedDailyMiningStat.findFirst({
orderBy: { statDate: 'desc' },
});
return {
totalDistributed: dailyStat?.totalDistributed || new Decimal(0),
totalBurned: dailyStat?.totalBurned || new Decimal(0),
};
}
/**
* Trading statistics
*/
private async getTradingStats(date: Date) {
const kline = await this.prisma.syncedDayKLine.findUnique({
where: { klineDate: date },
});
return {
tradingVolume: kline?.volume || new Decimal(0),
tradingAmount: kline?.amount || new Decimal(0),
tradeCount: kline?.tradeCount || 0,
};
}
/**
* Price statistics
*/
private async getPriceStats(date: Date) {
const kline = await this.prisma.syncedDayKLine.findUnique({
where: { klineDate: date },
});
const defaultPrice = new Decimal(1);
return {
openPrice: kline?.open || defaultPrice,
closePrice: kline?.close || defaultPrice,
highPrice: kline?.high || defaultPrice,
lowPrice: kline?.low || defaultPrice,
};
}
}
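Because generateHistoricalReport upserts on reportDate, re-running it for a day that already has a report simply refreshes it. A minimal backfill sketch, assuming the service instance is obtained from the Nest application context (the loop itself is not part of this changeset):

// Hypothetical one-off backfill over an inclusive date range.
async function backfillDailyReports(service: DailyReportService, from: Date, to: Date) {
  for (let d = new Date(from); d <= to; d.setDate(d.getDate() + 1)) {
    await service.generateHistoricalReport(new Date(d)); // idempotent: upserts on reportDate
  }
}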

View File

@ -112,22 +112,26 @@ export class DashboardService {
* Contribution statistics
*/
private async getContributionStats() {
const accounts = await this.prisma.syncedContributionAccount.aggregate({
_sum: {
totalContribution: true,
effectiveContribution: true,
personalContribution: true,
teamLevelContribution: true,
teamBonusContribution: true,
},
_count: true,
});
const systemContributions =
await this.prisma.syncedSystemContribution.aggregate({
const [accounts, systemContributions, adoptionStats] = await Promise.all([
this.prisma.syncedContributionAccount.aggregate({
_sum: {
totalContribution: true,
effectiveContribution: true,
personalContribution: true,
teamLevelContribution: true,
teamBonusContribution: true,
},
_count: true,
}),
this.prisma.syncedSystemContribution.aggregate({
_sum: { contributionBalance: true },
_count: true,
});
}),
this.prisma.syncedAdoption.aggregate({
_sum: { treeCount: true },
_count: true,
}),
]);
return {
totalAccounts: accounts._count,
@ -143,6 +147,8 @@ export class DashboardService {
systemAccounts: systemContributions._count,
systemContribution:
systemContributions._sum.contributionBalance?.toString() || '0',
totalAdoptions: adoptionStats._count,
totalTrees: adoptionStats._sum.treeCount || 0,
};
}

View File

@ -1,304 +0,0 @@
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
@Injectable()
export class InitializationService {
private readonly logger = new Logger(InitializationService.name);
constructor(
private readonly prisma: PrismaService,
private readonly configService: ConfigService,
) {}
async initializeMiningConfig(
adminId: string,
config: {
totalShares: string;
distributionPool: string;
halvingPeriodYears: number;
burnTarget: string;
},
): Promise<{ success: boolean; message: string }> {
const record = await this.prisma.initializationRecord.create({
data: { type: 'MINING_CONFIG', status: 'PENDING', config, executedBy: adminId },
});
try {
const miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const response = await fetch(`${miningServiceUrl}/api/v1/admin/initialize`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(config),
});
if (!response.ok) {
throw new Error('Failed to initialize mining config');
}
await this.prisma.initializationRecord.update({
where: { id: record.id },
data: { status: 'COMPLETED', executedAt: new Date() },
});
await this.prisma.auditLog.create({
data: { adminId, action: 'INIT', resource: 'MINING', resourceId: record.id, newValue: config },
});
return { success: true, message: 'Mining config initialized successfully' };
} catch (error: any) {
await this.prisma.initializationRecord.update({
where: { id: record.id },
data: { status: 'FAILED', errorMessage: error.message },
});
return { success: false, message: error.message };
}
}
async initializeSystemAccounts(adminId: string): Promise<{ success: boolean; message: string }> {
const accounts = [
{ accountType: 'OPERATION', name: '运营账户', description: '12% 运营收入' },
{ accountType: 'PROVINCE', name: '省公司账户', description: '1% 省公司收入' },
{ accountType: 'CITY', name: '市公司账户', description: '2% 市公司收入' },
];
for (const account of accounts) {
await this.prisma.systemAccount.upsert({
where: { accountType: account.accountType },
create: account,
update: { name: account.name, description: account.description },
});
}
await this.prisma.auditLog.create({
data: { adminId, action: 'INIT', resource: 'SYSTEM_ACCOUNT', newValue: accounts },
});
return { success: true, message: 'System accounts initialized successfully' };
}
async activateMining(adminId: string): Promise<{ success: boolean; message: string }> {
try {
const miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const response = await fetch(`${miningServiceUrl}/api/v1/admin/activate`, { method: 'POST' });
if (!response.ok) {
throw new Error('Failed to activate mining');
}
await this.prisma.auditLog.create({
data: { adminId, action: 'INIT', resource: 'MINING', newValue: { action: 'ACTIVATE' } },
});
return { success: true, message: 'Mining activated successfully' };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllUsers(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const authServiceUrl = this.configService.get<string>('AUTH_SERVICE_URL', 'http://localhost:3024');
const response = await fetch(`${authServiceUrl}/api/v2/admin/users/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch users: ${response.statusText}`);
}
const responseData = await response.json();
const users = responseData.data?.users || responseData.users || [];
let syncedCount = 0;
for (const user of users) {
try {
await this.prisma.syncedUser.upsert({
where: { accountSequence: user.accountSequence },
create: {
originalUserId: user.id || user.accountSequence,
accountSequence: user.accountSequence,
phone: user.phone,
status: user.status || 'ACTIVE',
kycStatus: user.kycStatus || 'PENDING',
realName: user.realName || null,
isLegacyUser: user.isLegacyUser || false,
createdAt: new Date(user.createdAt),
},
update: {
phone: user.phone,
status: user.status || 'ACTIVE',
kycStatus: user.kycStatus || 'PENDING',
realName: user.realName || null,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync user ${user.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'USER', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} users`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllContributionAccounts(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const contributionServiceUrl = this.configService.get<string>('CONTRIBUTION_SERVICE_URL', 'http://localhost:3020');
const response = await fetch(`${contributionServiceUrl}/api/v2/admin/accounts/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch accounts: ${response.statusText}`);
}
const responseData = await response.json();
const accounts = responseData.data?.accounts || responseData.accounts || [];
let syncedCount = 0;
for (const account of accounts) {
try {
await this.prisma.syncedContributionAccount.upsert({
where: { accountSequence: account.accountSequence },
create: {
accountSequence: account.accountSequence,
personalContribution: account.personalContribution || 0,
teamLevelContribution: account.teamLevelContribution || 0,
teamBonusContribution: account.teamBonusContribution || 0,
totalContribution: account.totalContribution || 0,
effectiveContribution: account.effectiveContribution || 0,
hasAdopted: account.hasAdopted || false,
directReferralCount: account.directReferralAdoptedCount || 0,
unlockedLevelDepth: account.unlockedLevelDepth || 0,
unlockedBonusTiers: account.unlockedBonusTiers || 0,
},
update: {
personalContribution: account.personalContribution,
teamLevelContribution: account.teamLevelContribution,
teamBonusContribution: account.teamBonusContribution,
totalContribution: account.totalContribution,
effectiveContribution: account.effectiveContribution,
hasAdopted: account.hasAdopted,
directReferralCount: account.directReferralAdoptedCount,
unlockedLevelDepth: account.unlockedLevelDepth,
unlockedBonusTiers: account.unlockedBonusTiers,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync account ${account.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'CONTRIBUTION_ACCOUNT', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} accounts`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllMiningAccounts(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const response = await fetch(`${miningServiceUrl}/api/v1/admin/accounts/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch accounts: ${response.statusText}`);
}
const responseData = await response.json();
const accounts = responseData.data?.accounts || responseData.accounts || [];
let syncedCount = 0;
for (const account of accounts) {
try {
await this.prisma.syncedMiningAccount.upsert({
where: { accountSequence: account.accountSequence },
create: {
accountSequence: account.accountSequence,
totalMined: account.totalMined || 0,
availableBalance: account.availableBalance || 0,
frozenBalance: account.frozenBalance || 0,
totalContribution: account.totalContribution || 0,
},
update: {
totalMined: account.totalMined,
availableBalance: account.availableBalance,
frozenBalance: account.frozenBalance,
totalContribution: account.totalContribution,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync mining account ${account.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'MINING_ACCOUNT', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} mining accounts`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllTradingAccounts(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const tradingServiceUrl = this.configService.get<string>('TRADING_SERVICE_URL', 'http://localhost:3022');
const response = await fetch(`${tradingServiceUrl}/api/v1/admin/accounts/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch accounts: ${response.statusText}`);
}
const responseData = await response.json();
const accounts = responseData.data?.accounts || responseData.accounts || [];
let syncedCount = 0;
for (const account of accounts) {
try {
await this.prisma.syncedTradingAccount.upsert({
where: { accountSequence: account.accountSequence },
create: {
accountSequence: account.accountSequence,
shareBalance: account.shareBalance || 0,
cashBalance: account.cashBalance || 0,
frozenShares: account.frozenShares || 0,
frozenCash: account.frozenCash || 0,
totalBought: account.totalBought || 0,
totalSold: account.totalSold || 0,
},
update: {
shareBalance: account.shareBalance,
cashBalance: account.cashBalance,
frozenShares: account.frozenShares,
frozenCash: account.frozenCash,
totalBought: account.totalBought,
totalSold: account.totalSold,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync trading account ${account.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'TRADING_ACCOUNT', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} trading accounts`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
}

View File

@ -7,55 +7,53 @@ export class SystemAccountsService {
/**
* Get system accounts
* Data comes from CDC-synced tables
*/
async getSystemAccounts() {
// Fetch from the local SystemAccount table first
const localAccounts = await this.prisma.systemAccount.findMany({
// Fetch data from the CDC-synced SyncedWalletSystemAccount table
const syncedAccounts = await this.prisma.syncedWalletSystemAccount.findMany({
orderBy: { accountType: 'asc' },
});
// Get contribution data from the CDC-synced SyncedSystemContribution table
const syncedContributions =
await this.prisma.syncedSystemContribution.findMany();
// Merge the data
const accountsMap = new Map<string, any>();
// Build a map of contribution data
const contributionMap = new Map<string, any>();
for (const contrib of syncedContributions) {
contributionMap.set(contrib.accountType, contrib);
}
// Add the local accounts
for (const account of localAccounts) {
accountsMap.set(account.accountType, {
// Build the response data
const accounts = syncedAccounts.map((account) => {
const contrib = contributionMap.get(account.accountType);
return {
id: account.originalId,
accountType: account.accountType,
name: account.name,
description: account.description,
totalContribution: account.totalContribution.toString(),
createdAt: account.createdAt,
source: 'local',
});
}
// Update or add the synced contribution data
for (const contrib of syncedContributions) {
const existing = accountsMap.get(contrib.accountType);
if (existing) {
existing.contributionBalance = contrib.contributionBalance.toString();
existing.contributionNeverExpires = contrib.contributionNeverExpires;
existing.syncedAt = contrib.syncedAt;
existing.source = 'synced';
} else {
accountsMap.set(contrib.accountType, {
accountType: contrib.accountType,
name: contrib.name,
contributionBalance: contrib.contributionBalance.toString(),
contributionNeverExpires: contrib.contributionNeverExpires,
syncedAt: contrib.syncedAt,
source: 'synced',
});
}
}
code: account.code,
provinceId: account.provinceId,
cityId: account.cityId,
shareBalance: account.shareBalance.toString(),
usdtBalance: account.usdtBalance.toString(),
greenPointBalance: account.greenPointBalance.toString(),
frozenShare: account.frozenShare.toString(),
frozenUsdt: account.frozenUsdt.toString(),
totalInflow: account.totalInflow.toString(),
totalOutflow: account.totalOutflow.toString(),
blockchainAddress: account.blockchainAddress,
isActive: account.isActive,
contributionBalance: contrib?.contributionBalance?.toString() || '0',
contributionNeverExpires: contrib?.contributionNeverExpires || false,
syncedAt: account.syncedAt,
source: 'cdc',
};
});
return {
accounts: Array.from(accountsMap.values()),
total: accountsMap.size,
accounts,
total: accounts.length,
};
}
@ -63,22 +61,21 @@ export class SystemAccountsService {
*
*/
async getSystemAccountsSummary() {
const [localAccounts, syncedContributions, miningConfig, circulationPool] =
await Promise.all([
this.prisma.systemAccount.findMany(),
this.prisma.syncedSystemContribution.findMany(),
this.prisma.syncedMiningConfig.findFirst(),
this.prisma.syncedCirculationPool.findFirst(),
]);
const [
syncedSystemAccounts,
syncedPoolAccounts,
syncedContributions,
miningConfig,
circulationPool,
] = await Promise.all([
this.prisma.syncedWalletSystemAccount.findMany(),
this.prisma.syncedWalletPoolAccount.findMany(),
this.prisma.syncedSystemContribution.findMany(),
this.prisma.syncedMiningConfig.findFirst(),
this.prisma.syncedCirculationPool.findFirst(),
]);
// Compute total contribution
let totalSystemContribution = 0n;
for (const account of localAccounts) {
totalSystemContribution += BigInt(
account.totalContribution.toString().replace('.', ''),
);
}
let totalSyncedContribution = 0n;
for (const contrib of syncedContributions) {
totalSyncedContribution += BigInt(
@ -88,11 +85,22 @@ export class SystemAccountsService {
return {
systemAccounts: {
count: localAccounts.length,
totalContribution: (
Number(totalSystemContribution) / 100000000
count: syncedSystemAccounts.length,
totalBalance: syncedSystemAccounts.reduce(
(sum, acc) => sum + Number(acc.shareBalance),
0,
).toFixed(8),
},
poolAccounts: {
count: syncedPoolAccounts.length,
pools: syncedPoolAccounts.map((pool) => ({
poolType: pool.poolType,
name: pool.name,
balance: pool.balance.toString(),
targetBurn: pool.targetBurn?.toString(),
remainingBurn: pool.remainingBurn?.toString(),
})),
},
syncedContributions: {
count: syncedContributions.length,
totalBalance: (Number(totalSyncedContribution) / 100000000).toFixed(8),
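The summary sums contribution balances as BigInt values scaled by 10^8 and scales back with / 100000000 at the end, which relies on every balance string carrying exactly 8 decimal places. A small worked sketch of that round trip under that assumption:

// Assumes each balance is serialized with exactly 8 decimal places,
// which is what BigInt(x.replace('.', '')) relies on.
const balances = ['123.45678900', '0.00000001'];
let scaled = 0n;
for (const b of balances) scaled += BigInt(b.replace('.', ''));
console.log((Number(scaled) / 100000000).toFixed(8)); // "123.45678901"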

View File

@ -1,4 +1,5 @@
import { Injectable, NotFoundException } from '@nestjs/common';
import { Injectable, NotFoundException, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { Prisma } from '@prisma/client';
@ -20,7 +21,15 @@ export interface GetOrdersQuery {
@Injectable()
export class UsersService {
constructor(private readonly prisma: PrismaService) {}
private readonly logger = new Logger(UsersService.name);
private readonly miningServiceUrl: string;
constructor(
private readonly prisma: PrismaService,
private readonly configService: ConfigService,
) {
this.miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
}
/**
* Adoption statistics for a batch of users
@ -103,32 +112,38 @@ export class UsersService {
*/
private async getAdoptionStatsForUsers(
accountSequences: string[],
): Promise<Map<string, { personalCount: number; teamCount: number }>> {
): Promise<Map<string, { personalCount: number; personalOrders: number; teamCount: number; teamOrders: number }>> {
const result = new Map<
string,
{ personalCount: number; teamCount: number }
{ personalCount: number; personalOrders: number; teamCount: number; teamOrders: number }
>();
if (accountSequences.length === 0) return result;
// Get each user's personal adoption count
// Get each user's personal adoption count and order count (only MINING_ENABLED records)
const personalAdoptions = await this.prisma.syncedAdoption.groupBy({
by: ['accountSequence'],
where: { accountSequence: { in: accountSequences } },
where: {
accountSequence: { in: accountSequences },
status: 'MINING_ENABLED',
},
_sum: { treeCount: true },
_count: { id: true },
});
for (const stat of personalAdoptions) {
result.set(stat.accountSequence, {
personalCount: stat._sum.treeCount || 0,
personalOrders: stat._count.id || 0,
teamCount: 0,
teamOrders: 0,
});
}
// Make sure every user has an entry
for (const seq of accountSequences) {
if (!result.has(seq)) {
result.set(seq, { personalCount: 0, teamCount: 0 });
result.set(seq, { personalCount: 0, personalOrders: 0, teamCount: 0, teamOrders: 0 });
}
}
@ -153,12 +168,15 @@ export class UsersService {
const teamAdoptionStats = await this.prisma.syncedAdoption.aggregate({
where: {
accountSequence: { in: teamMembers.map((m) => m.accountSequence) },
status: 'MINING_ENABLED',
},
_sum: { treeCount: true },
_count: { id: true },
});
const stats = result.get(ref.accountSequence);
if (stats) {
stats.teamCount = teamAdoptionStats._sum.treeCount || 0;
stats.teamOrders = teamAdoptionStats._count.id || 0;
}
}
}
@ -212,9 +230,9 @@ export class UsersService {
throw new NotFoundException(`用户 ${accountSequence} 不存在`);
}
// Get the personal adoption count (aggregated from synced_adoptions)
// Get the personal adoption count (aggregated from synced_adoptions, only MINING_ENABLED records)
const personalAdoptionStats = await this.prisma.syncedAdoption.aggregate({
where: { accountSequence },
where: { accountSequence, status: 'MINING_ENABLED' },
_sum: { treeCount: true },
_count: { id: true },
});
@ -226,7 +244,7 @@ export class UsersService {
});
const directReferralCount = directReferrals.length;
// Get the direct referrals' adoption count
// Get the direct referrals' adoption count (only MINING_ENABLED records)
let directReferralAdoptions = 0;
if (directReferrals.length > 0) {
const directAdoptionStats = await this.prisma.syncedAdoption.aggregate({
@ -234,6 +252,7 @@ export class UsersService {
accountSequence: {
in: directReferrals.map((r) => r.accountSequence),
},
status: 'MINING_ENABLED',
},
_sum: { treeCount: true },
});
@ -267,6 +286,7 @@ export class UsersService {
accountSequence: {
in: teamMembers.map((m) => m.accountSequence),
},
status: 'MINING_ENABLED',
},
_sum: { treeCount: true },
});
@ -412,8 +432,7 @@ export class UsersService {
}
/**
* Get a user's mining records
* Summary comes from the synced mining account; detailed records are fetched from mining-service
*/
async getUserMiningRecords(
accountSequence: string,
@ -430,33 +449,79 @@ export class UsersService {
}
const mining = user.miningAccount;
if (!mining) {
const emptySummary = {
accountSequence,
totalMined: '0',
availableBalance: '0',
frozenBalance: '0',
totalContribution: '0',
};
// Fetch mining records from mining-service
try {
const url = `${this.miningServiceUrl}/api/v2/mining/accounts/${accountSequence}/records?page=${page}&pageSize=${pageSize}`;
this.logger.log(`Fetching mining records from ${url}`);
const response = await fetch(url);
if (!response.ok) {
this.logger.warn(`Failed to fetch mining records: ${response.status}`);
return {
summary: mining ? {
accountSequence,
totalMined: mining.totalMined.toString(),
availableBalance: mining.availableBalance.toString(),
frozenBalance: mining.frozenBalance.toString(),
totalContribution: mining.totalContribution.toString(),
} : emptySummary,
records: [],
pagination: { page, pageSize, total: 0, totalPages: 0 },
};
}
const result = await response.json();
const recordsData = result.data || result;
// Format records to match the shape the frontend expects
const records = (recordsData.data || []).map((r: any) => ({
id: r.id,
accountSequence,
distributionMinute: r.miningMinute,
contributionRatio: r.contributionRatio,
shareAmount: r.minedAmount,
priceSnapshot: r.secondDistribution,
createdAt: r.createdAt,
}));
return {
summary: {
summary: mining ? {
accountSequence,
totalMined: '0',
availableBalance: '0',
frozenBalance: '0',
totalContribution: '0',
totalMined: mining.totalMined.toString(),
availableBalance: mining.availableBalance.toString(),
frozenBalance: mining.frozenBalance.toString(),
totalContribution: mining.totalContribution.toString(),
} : emptySummary,
records,
pagination: {
page,
pageSize,
total: recordsData.total || 0,
totalPages: Math.ceil((recordsData.total || 0) / pageSize),
},
};
} catch (error) {
this.logger.error('Failed to fetch mining records from mining-service', error);
return {
summary: mining ? {
accountSequence,
totalMined: mining.totalMined.toString(),
availableBalance: mining.availableBalance.toString(),
frozenBalance: mining.frozenBalance.toString(),
totalContribution: mining.totalContribution.toString(),
} : emptySummary,
records: [],
pagination: { page, pageSize, total: 0, totalPages: 0 },
};
}
return {
summary: {
accountSequence,
totalMined: mining.totalMined.toString(),
availableBalance: mining.availableBalance.toString(),
frozenBalance: mining.frozenBalance.toString(),
totalContribution: mining.totalContribution.toString(),
},
// Detailed entries must be fetched from mining-service
records: [],
pagination: { page, pageSize, total: 0, totalPages: 0 },
note: '详细挖矿记录请查看 mining-service',
};
}
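For reference, the field mapping getUserMiningRecords applies to each record returned by mining-service, shown with an illustrative record (only the field names come from the code above; the values are made up):

// Raw record as returned by mining-service (illustrative values).
const raw = {
  id: 'rec-1',
  miningMinute: '2026-01-15T00:01:00.000Z',
  contributionRatio: '0.0297',
  minedAmount: '0.02825343',
  secondDistribution: '0.015854896',
  createdAt: '2026-01-15T00:02:00.000Z',
};
// Shape handed to the admin frontend:
// { id, accountSequence, distributionMinute: raw.miningMinute, contributionRatio: raw.contributionRatio,
//   shareAmount: raw.minedAmount, priceSnapshot: raw.secondDistribution, createdAt: raw.createdAt }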
/**
@ -568,14 +633,14 @@ export class UsersService {
}
/**
* Adoption statistics for a single user
* Only MINING_ENABLED adoptions are counted
*/
private async getUserAdoptionStats(
accountSequence: string,
): Promise<{ personal: number; team: number }> {
// Personal adoptions
// Personal adoptions (only MINING_ENABLED records)
const personalStats = await this.prisma.syncedAdoption.aggregate({
where: { accountSequence },
where: { accountSequence, status: 'MINING_ENABLED' },
_sum: { treeCount: true },
});
@ -587,7 +652,7 @@ export class UsersService {
let teamCount = 0;
if (referral?.originalUserId) {
// Team adoptions = sum of all downline adoptions
// Team adoptions = sum of all downline adoptions (only MINING_ENABLED records)
const teamMembers = await this.prisma.syncedReferral.findMany({
where: {
ancestorPath: { contains: referral.originalUserId.toString() },
@ -599,6 +664,7 @@ export class UsersService {
const teamStats = await this.prisma.syncedAdoption.aggregate({
where: {
accountSequence: { in: teamMembers.map((m) => m.accountSequence) },
status: 'MINING_ENABLED',
},
_sum: { treeCount: true },
});
@ -840,7 +906,7 @@ export class UsersService {
/**
* Get a user's wallet ledger
* TODO: detailed ledger entries still need to come from mining-service
* Summary is built from SyncedUserWallet and SyncedMiningAccount
*/
async getWalletLedger(accountSequence: string, page: number, pageSize: number) {
const user = await this.prisma.syncedUser.findUnique({
@ -852,20 +918,44 @@ export class UsersService {
throw new NotFoundException(`用户 ${accountSequence} 不存在`);
}
// Get the user's wallets of each type
const wallets = await this.prisma.syncedUserWallet.findMany({
where: { accountSequence },
});
// Index wallets by type
const walletByType = new Map(wallets.map(w => [w.walletType, w]));
const greenPointsWallet = walletByType.get('GREEN_POINTS');
const contributionWallet = walletByType.get('CONTRIBUTION');
const tokenWallet = walletByType.get('TOKEN_STORAGE');
const mining = user.miningAccount;
// Build the wallet summary in the shape the frontend expects
// usdtAvailable = available balance of the GREEN_POINTS wallet (green points)
// usdtFrozen = frozen balance of the GREEN_POINTS wallet
// pendingUsdt = pending rewards (mining balance)
// settleableUsdt = settleable rewards
// settledTotalUsdt = settled rewards
// expiredTotalUsdt = expired rewards
const summary = {
usdtAvailable: greenPointsWallet?.balance?.toString() || '0',
usdtFrozen: greenPointsWallet?.frozenBalance?.toString() || '0',
pendingUsdt: mining?.availableBalance?.toString() || '0', // mining available balance treated as pending
settleableUsdt: '0', // no data source yet
settledTotalUsdt: greenPointsWallet?.totalInflow?.toString() || '0', // total inflow treated as settled
expiredTotalUsdt: '0', // no data source yet
};
// TODO: implement paginated wallet ledger queries
// SyncedUserWallet only holds summary data; ledger entries need an additional table
return {
summary: {
availableBalance: mining?.availableBalance?.toString() || '0',
frozenBalance: mining?.frozenBalance?.toString() || '0',
totalMined: mining?.totalMined?.toString() || '0',
},
summary,
items: [],
total: 0,
page,
pageSize,
totalPages: 0,
note: '钱包流水数据需要从 mining-service 同步',
};
}
@ -876,7 +966,7 @@ export class UsersService {
private formatUserListItem(
user: any,
extra?: {
adoptionStats?: { personalCount: number; teamCount: number };
adoptionStats?: { personalCount: number; personalOrders: number; teamCount: number; teamOrders: number };
referrerInfo?: { nickname: string | null; phone: string } | null;
},
) {
@ -892,7 +982,9 @@ export class UsersService {
// Adoption statistics
adoption: {
personalAdoptionCount: extra?.adoptionStats?.personalCount || 0,
personalAdoptionOrders: extra?.adoptionStats?.personalOrders || 0,
teamAdoptions: extra?.adoptionStats?.teamCount || 0,
teamAdoptionOrders: extra?.adoptionStats?.teamOrders || 0,
},
// Referrer info
referral: user.referral

View File

@ -353,6 +353,12 @@ export class CdcSyncService implements OnModuleInit {
this.withIdempotency(this.walletHandlers.handleFeeConfigUpdated.bind(this.walletHandlers)),
);
// CONTRIBUTION_CREDITED event - update the user wallet when contribution is credited
this.cdcConsumer.registerServiceHandler(
'CONTRIBUTION_CREDITED',
this.withIdempotency(this.handleContributionCredited.bind(this)),
);
this.logger.log('CDC sync handlers registered with idempotency protection');
}
@ -813,4 +819,60 @@ export class CdcSyncService implements OnModuleInit {
this.logger.debug('Synced circulation pool');
}
// ===========================================================================
// Wallet event handlers (mining-wallet-service)
// ===========================================================================
/**
* Handle CONTRIBUTION_CREDITED
* Emitted by mining-wallet-service
* payload: { accountSequence, walletType, amount, balanceAfter, transactionId, ... }
*/
private async handleContributionCredited(event: ServiceEvent, tx: TransactionClient): Promise<void> {
const { payload } = event;
const walletType = payload.walletType || 'CONTRIBUTION';
// Check whether a wallet record already exists
const existing = await tx.syncedUserWallet.findUnique({
where: {
accountSequence_walletType: {
accountSequence: payload.accountSequence,
walletType,
},
},
});
if (existing) {
// Update the balance (using the latest balanceAfter)
await tx.syncedUserWallet.update({
where: { id: existing.id },
data: {
balance: payload.balanceAfter,
totalInflow: {
increment: parseFloat(payload.amount) || 0,
},
},
});
} else {
// Create a new wallet record
// originalId is a stable ID derived from accountSequence + walletType
const originalId = `wallet-${payload.accountSequence}-${walletType}`;
await tx.syncedUserWallet.create({
data: {
originalId,
accountSequence: payload.accountSequence,
walletType,
balance: payload.balanceAfter || 0,
frozenBalance: 0,
totalInflow: parseFloat(payload.amount) || 0,
totalOutflow: 0,
isActive: true,
},
});
}
this.logger.debug(`Synced user wallet from CONTRIBUTION_CREDITED: ${payload.accountSequence}, balance: ${payload.balanceAfter}`);
}
}
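A sketch of the event payload handleContributionCredited expects, based on the fields it reads above (the values are illustrative):

// Illustrative CONTRIBUTION_CREDITED payload; field names follow the handler above.
const examplePayload = {
  accountSequence: 'A000123',
  walletType: 'CONTRIBUTION',
  amount: '12.5',
  balanceAfter: '112.5',
  transactionId: 'tx-abc',
};
// First delivery creates the synced wallet with balance 112.5 and totalInflow 12.5;
// a later event for the same wallet overwrites balance with its balanceAfter and increments totalInflow by amount.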

View File

@ -14,7 +14,7 @@ RUN npm ci
RUN DATABASE_URL="postgresql://user:pass@localhost:5432/db" npx prisma generate
COPY src ./src
RUN npm run build
RUN npm run build && ls -la dist/ && test -f dist/main.js && echo "Build successful: dist/main.js exists"
# Stage 2: production runtime
FROM node:20-alpine AS runner
@ -30,14 +30,16 @@ WORKDIR /app
USER nestjs
COPY --chown=nestjs:nodejs package*.json ./
RUN npm ci --only=production && npm cache clean --force
COPY --chown=nestjs:nodejs tsconfig*.json ./
RUN npm ci --only=production && npm install ts-node typescript @types/node --save-dev && npm cache clean --force
COPY --chown=nestjs:nodejs prisma ./prisma/
RUN DATABASE_URL="postgresql://user:pass@localhost:5432/db" npx prisma generate
COPY --chown=nestjs:nodejs --from=builder /app/dist ./dist
RUN ls -la dist/ && test -f dist/main.js && echo "Copy successful: dist/main.js exists"
RUN printf '#!/bin/sh\nset -e\necho "Running database migrations..."\nnpx prisma migrate deploy\necho "Starting application..."\nexec node dist/main.js\n' > /app/start.sh && chmod +x /app/start.sh
RUN printf '#!/bin/sh\nset -e\necho "Running database migrations..."\nnpx prisma migrate deploy\necho "Running database seed..."\nnpx prisma db seed || echo "Seed skipped or already applied"\necho "Starting application..."\nexec node dist/main.js\n' > /app/start.sh && chmod +x /app/start.sh
ENV NODE_ENV=production
ENV TZ=Asia/Shanghai

View File

@ -16,7 +16,8 @@
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate dev",
"prisma:migrate:prod": "prisma migrate deploy",
"prisma:studio": "prisma studio"
"prisma:studio": "prisma studio",
"prisma:seed": "ts-node prisma/seed.ts"
},
"dependencies": {
"@nestjs/common": "^10.3.0",
@ -37,6 +38,9 @@
"rxjs": "^7.8.1",
"swagger-ui-express": "^5.0.0"
},
"prisma": {
"seed": "ts-node prisma/seed.ts"
},
"devDependencies": {
"@nestjs/cli": "^10.2.1",
"@nestjs/schematics": "^10.0.3",

View File

@ -0,0 +1,13 @@
-- ============================================================================
-- Rename minuteDistribution to secondDistribution
-- Supports per-second mining distribution
-- ============================================================================
-- Rename the column on mining_configs
ALTER TABLE "mining_configs" RENAME COLUMN "minuteDistribution" TO "secondDistribution";
-- Rename the column on mining_eras
ALTER TABLE "mining_eras" RENAME COLUMN "minuteDistribution" TO "secondDistribution";
-- Rename the column on mining_records
ALTER TABLE "mining_records" RENAME COLUMN "minuteDistribution" TO "secondDistribution";

View File

@ -18,7 +18,7 @@ model MiningConfig {
halvingPeriodYears Int @default(2) // Halving period (years)
currentEra Int @default(1) // Current era
eraStartDate DateTime // Start date of the current era
minuteDistribution Decimal @db.Decimal(30, 18) // Per-minute distribution
secondDistribution Decimal @db.Decimal(30, 18) // Per-second distribution
isActive Boolean @default(false) // Whether mining has been activated
activatedAt DateTime? // Activation time
createdAt DateTime @default(now())
@ -35,7 +35,7 @@ model MiningEra {
endDate DateTime?
initialDistribution Decimal @db.Decimal(30, 8) // Initial distributable amount for the era
totalDistributed Decimal @default(0) @db.Decimal(30, 8) // Amount distributed so far
minuteDistribution Decimal @db.Decimal(30, 18) // Per-minute distribution
secondDistribution Decimal @db.Decimal(30, 18) // Per-second distribution
isActive Boolean @default(true)
createdAt DateTime @default(now())
@ -63,15 +63,16 @@ model MiningAccount {
@@map("mining_accounts")
}
// Mining records (per-minute)
// Mining records (per-minute summaries)
// Balances update every second; one summary record is written per minute
model MiningRecord {
id String @id @default(uuid())
accountSequence String
miningMinute DateTime // Mining minute (truncated to the minute)
miningMinute DateTime // Mining time (truncated to the minute)
contributionRatio Decimal @db.Decimal(30, 18) // Contribution ratio at that time
totalContribution Decimal @db.Decimal(30, 8) // Total contribution at that time
minuteDistribution Decimal @db.Decimal(30, 18) // Total distribution for that minute
minedAmount Decimal @db.Decimal(30, 18) // Amount mined
secondDistribution Decimal @db.Decimal(30, 18) // Per-second distribution
minedAmount Decimal @db.Decimal(30, 18) // Amount mined in that minute
createdAt DateTime @default(now())
account MiningAccount @relation(fields: [accountSequence], references: [accountSequence])

View File

@ -0,0 +1,119 @@
import { PrismaClient } from '@prisma/client';
import Decimal from 'decimal.js';
const prisma = new PrismaClient();
/**
* Mining Service seed data
*
* Configuration:
* - Total shares: 100.02B
* - Distribution pool: 2,000,000
* - Era 1 distribution: 1,000,000 (first two-year era)
* - Burn target: 10B, completed over 10 years
* - Mining starts inactive and must be activated by an admin
*/
async function main() {
console.log('🚀 Mining-service seed starting...\n');
const now = new Date();
// Constants
const TOTAL_SHARES = new Decimal('100020000000'); // 100.02B
const DISTRIBUTION_POOL = new Decimal('2000000'); // 2 million
const ERA1_DISTRIBUTION = new Decimal('1000000'); // 1 million for the first two-year era
const BURN_TARGET = new Decimal('10000000000'); // 10 billion
// Per-second distribution: 1,000,000 / (2 years * 365 days * 24 hours * 60 minutes * 60 seconds)
const SECONDS_IN_2_YEARS = 2 * 365 * 24 * 60 * 60; // 63,072,000 seconds
const SECOND_DISTRIBUTION = ERA1_DISTRIBUTION.dividedBy(SECONDS_IN_2_YEARS);
// 1. MiningConfig - mining configuration (inactive; waits for the admin to start it manually)
await prisma.miningConfig.upsert({
where: { id: 'default' },
create: {
id: 'default',
totalShares: TOTAL_SHARES,
distributionPool: DISTRIBUTION_POOL,
remainingDistribution: ERA1_DISTRIBUTION,
halvingPeriodYears: 2,
currentEra: 1,
eraStartDate: now,
secondDistribution: SECOND_DISTRIBUTION,
isActive: false, // waits for the admin to activate it in the backoffice
activatedAt: null,
},
update: {},
});
console.log('✅ MiningConfig initialized (inactive, waiting for admin activation)');
// 2. BlackHole - black-hole account
await prisma.blackHole.upsert({
where: { id: 'default' },
create: {
id: 'default',
totalBurned: 0,
targetBurn: BURN_TARGET,
remainingBurn: BURN_TARGET,
},
update: {},
});
console.log('✅ BlackHole initialized');
// 3. MiningEra - first era
await prisma.miningEra.upsert({
where: { eraNumber: 1 },
create: {
eraNumber: 1,
startDate: now,
initialDistribution: ERA1_DISTRIBUTION,
totalDistributed: 0,
secondDistribution: SECOND_DISTRIBUTION,
isActive: true,
},
update: {},
});
console.log('✅ MiningEra 1 initialized');
// 4. PoolAccounts - pool accounts
const pools = [
{ poolType: 'SHARE_POOL', name: '积分股池', balance: TOTAL_SHARES },
{ poolType: 'BLACK_HOLE_POOL', name: '黑洞池', balance: new Decimal(0) },
{ poolType: 'CIRCULATION_POOL', name: '流通池', balance: new Decimal(0) },
];
for (const pool of pools) {
await prisma.poolAccount.upsert({
where: { poolType: pool.poolType as any },
create: {
poolType: pool.poolType as any,
name: pool.name,
balance: pool.balance,
totalInflow: pool.balance,
totalOutflow: 0,
isActive: true,
},
update: {},
});
}
console.log('✅ PoolAccounts initialized');
// Print the configuration
console.log('\n📊 Configuration:');
console.log(` Total Shares: ${TOTAL_SHARES.toFixed(0)} (100.02B)`);
console.log(` Distribution Pool: ${DISTRIBUTION_POOL.toFixed(0)} (200万)`);
console.log(` Era 1 Distribution: ${ERA1_DISTRIBUTION.toFixed(0)} (100万)`);
console.log(` Seconds in 2 Years: ${SECONDS_IN_2_YEARS}`);
console.log(` Second Distribution: ${SECOND_DISTRIBUTION.toFixed(12)}`);
console.log(` Burn Target: ${BURN_TARGET.toFixed(0)} (100亿, 10年完成)`);
console.log(` Mining Active: false (需要在管理后台手动启动)`);
console.log('\n🎉 Mining-service seed completed!');
}
main()
.catch((e) => {
console.error('❌ Seed failed:', e);
process.exit(1);
})
.finally(() => prisma.$disconnect());
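For reference, the per-second figure the seed derives is 1,000,000 shares spread over two years of seconds. A quick arithmetic check (assuming 365-day years, as the seed does):

import Decimal from 'decimal.js';

const secondsInTwoYears = 2 * 365 * 24 * 60 * 60; // 63,072,000
const secondDistribution = new Decimal(1_000_000).dividedBy(secondsInTwoYears);
console.log(secondDistribution.toFixed(9)); // ≈ 0.015854896 shares per second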

View File

@ -1,4 +1,4 @@
import { Controller, Get } from '@nestjs/common';
import { Controller, Get, Post, HttpException, HttpStatus } from '@nestjs/common';
import { ApiTags, ApiOperation } from '@nestjs/swagger';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ -37,4 +37,83 @@ export class AdminController {
total: accounts.length,
};
}
@Get('status')
@Public()
@ApiOperation({ summary: '获取挖矿系统状态' })
async getStatus() {
const config = await this.prisma.miningConfig.findFirst();
const blackHole = await this.prisma.blackHole.findFirst();
const accountCount = await this.prisma.miningAccount.count();
const totalContribution = await this.prisma.miningAccount.aggregate({
_sum: { totalContribution: true },
});
return {
initialized: !!config,
isActive: config?.isActive || false,
activatedAt: config?.activatedAt,
currentEra: config?.currentEra || 0,
remainingDistribution: config?.remainingDistribution?.toString() || '0',
secondDistribution: config?.secondDistribution?.toString() || '0',
blackHole: blackHole
? {
totalBurned: blackHole.totalBurned.toString(),
targetBurn: blackHole.targetBurn.toString(),
remainingBurn: blackHole.remainingBurn.toString(),
}
: null,
accountCount,
totalContribution: totalContribution._sum.totalContribution?.toString() || '0',
};
}
@Post('activate')
@Public()
@ApiOperation({ summary: '激活挖矿系统' })
async activate() {
const config = await this.prisma.miningConfig.findFirst();
if (!config) {
throw new HttpException('挖矿系统未初始化,请先运行 seed 脚本', HttpStatus.BAD_REQUEST);
}
if (config.isActive) {
return { success: true, message: '挖矿系统已经处于激活状态' };
}
await this.prisma.miningConfig.update({
where: { id: config.id },
data: {
isActive: true,
activatedAt: new Date(),
},
});
return { success: true, message: '挖矿系统已激活' };
}
@Post('deactivate')
@Public()
@ApiOperation({ summary: '停用挖矿系统' })
async deactivate() {
const config = await this.prisma.miningConfig.findFirst();
if (!config) {
throw new HttpException('挖矿系统未初始化', HttpStatus.BAD_REQUEST);
}
if (!config.isActive) {
return { success: true, message: '挖矿系统已经处于停用状态' };
}
await this.prisma.miningConfig.update({
where: { id: config.id },
data: {
isActive: false,
},
});
return { success: true, message: '挖矿系统已停用' };
}
}
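A sketch of how an operator might drive these endpoints, assuming the controller is mounted under /api/v1/admin on the mining-service (the prefix and base URL mirror the defaults used elsewhere in this changeset and are assumptions here):

// Illustrative operator flow: check status, then activate only if initialized and not yet active.
async function activateIfInitialized(base = 'http://localhost:3021/api/v1/admin') {
  const status = await fetch(`${base}/status`).then((r) => r.json());
  if (status.initialized && !status.isActive) {
    const result = await fetch(`${base}/activate`, { method: 'POST' }).then((r) => r.json());
    console.log(result.message);
  }
}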

View File

@ -2,9 +2,11 @@ import { Controller, Get } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiResponse } from '@nestjs/swagger';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ApiTags('Health')
@Controller('health')
@Public()
export class HealthController {
constructor(
private readonly prisma: PrismaService,

View File

@ -2,9 +2,11 @@ import { Controller, Get, Param, Query, NotFoundException } from '@nestjs/common
import { ApiTags, ApiOperation, ApiResponse, ApiParam, ApiQuery } from '@nestjs/swagger';
import { GetMiningAccountQuery } from '../../application/queries/get-mining-account.query';
import { GetMiningStatsQuery } from '../../application/queries/get-mining-stats.query';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ApiTags('Mining')
@Controller('mining')
@Public() // Inter-service calls; no authentication required
export class MiningController {
constructor(
private readonly getAccountQuery: GetMiningAccountQuery,

View File

@ -1,43 +1,80 @@
import { Injectable, Logger } from '@nestjs/common';
import { EventPattern, Payload } from '@nestjs/microservices';
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { ContributionSyncService } from '../services/contribution-sync.service';
import { Kafka, Consumer, EachMessagePayload } from 'kafkajs';
@Injectable()
export class ContributionEventHandler {
export class ContributionEventHandler implements OnModuleInit {
private readonly logger = new Logger(ContributionEventHandler.name);
private consumer: Consumer;
constructor(private readonly syncService: ContributionSyncService) {}
constructor(
private readonly syncService: ContributionSyncService,
private readonly configService: ConfigService,
) {}
async onModuleInit() {
const kafkaBrokers = this.configService.get<string>('KAFKA_BROKERS', 'localhost:9092');
const topic = this.configService.get<string>('CDC_TOPIC_CONTRIBUTION_OUTBOX', 'cdc.contribution.outbox');
const kafka = new Kafka({
clientId: 'mining-service',
brokers: kafkaBrokers.split(','),
});
this.consumer = kafka.consumer({ groupId: 'mining-service-contribution-sync' });
@EventPattern('contribution.ContributionCalculated')
async handleContributionCalculated(@Payload() message: any): Promise<void> {
try {
const { payload } = message.value || message;
this.logger.debug(`Received ContributionCalculated event for ${payload.accountSequence}`);
await this.consumer.connect();
await this.consumer.subscribe({ topic, fromBeginning: false });
await this.syncService.handleContributionCalculated({
accountSequence: payload.accountSequence,
personalContribution: payload.personalContribution,
calculatedAt: payload.calculatedAt,
await this.consumer.run({
eachMessage: async (payload: EachMessagePayload) => {
await this.handleMessage(payload);
},
});
this.logger.log(`Subscribed to ${topic} for contribution sync`);
} catch (error) {
this.logger.error('Failed to handle ContributionCalculated event', error);
this.logger.error('Failed to connect to Kafka for contribution sync', error);
}
}
@EventPattern('contribution.DailySnapshotCreated')
async handleDailySnapshotCreated(@Payload() message: any): Promise<void> {
private async handleMessage(payload: EachMessagePayload): Promise<void> {
try {
const { payload } = message.value || message;
this.logger.log(`Received DailySnapshotCreated event for ${payload.snapshotDate}`);
const { message } = payload;
if (!message.value) return;
await this.syncService.handleDailySnapshotCreated({
snapshotId: payload.snapshotId,
snapshotDate: payload.snapshotDate,
totalContribution: payload.totalContribution,
activeAccounts: payload.activeAccounts,
});
const event = JSON.parse(message.value.toString());
// CDC message format: { after: { event_type, payload, ... } }
const data = event.after || event;
const eventType = data.event_type || data.eventType;
const eventPayload = typeof data.payload === 'string' ? JSON.parse(data.payload) : data.payload;
if (!eventPayload) return;
if (eventType === 'ContributionAccountUpdated') {
this.logger.debug(`Received ContributionAccountUpdated for ${eventPayload.accountSequence}`);
// Use effectiveContribution as the mining contribution
await this.syncService.handleContributionCalculated({
accountSequence: eventPayload.accountSequence,
personalContribution: eventPayload.effectiveContribution || eventPayload.totalContribution || '0',
calculatedAt: eventPayload.createdAt || new Date().toISOString(),
});
} else if (eventType === 'DailySnapshotCreated') {
this.logger.log(`Received DailySnapshotCreated for ${eventPayload.snapshotDate}`);
await this.syncService.handleDailySnapshotCreated({
snapshotId: eventPayload.snapshotId,
snapshotDate: eventPayload.snapshotDate,
totalContribution: eventPayload.totalContribution,
activeAccounts: eventPayload.activeAccounts,
});
}
} catch (error) {
this.logger.error('Failed to handle DailySnapshotCreated event', error);
this.logger.error('Failed to handle contribution event', error);
}
}
}
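handleMessage expects a CDC envelope in which the outbox row sits under "after" and its "payload" column may itself be a JSON string. A sketch of a message value it would accept (the values are illustrative):

// Illustrative Kafka message value on the cdc.contribution.outbox topic.
const value = JSON.stringify({
  after: {
    event_type: 'ContributionAccountUpdated',
    payload: JSON.stringify({
      accountSequence: 'A000123',
      effectiveContribution: '42.5',
      totalContribution: '50',
      createdAt: '2026-01-15T00:00:00.000Z',
    }),
  },
});
// handleMessage() unwraps `after`, parses the stringified payload, and forwards
// effectiveContribution (falling back to totalContribution) to handleContributionCalculated().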

View File

@ -17,7 +17,7 @@ export interface MiningRecordDto {
miningMinute: Date;
contributionRatio: string;
totalContribution: string;
minuteDistribution: string;
secondDistribution: string;
minedAmount: string;
createdAt: Date;
}
@ -79,7 +79,7 @@ export class GetMiningAccountQuery {
miningMinute: r.miningMinute,
contributionRatio: r.contributionRatio.toString(),
totalContribution: r.totalContribution.toString(),
minuteDistribution: r.minuteDistribution.toString(),
secondDistribution: r.secondDistribution.toString(),
minedAmount: r.minedAmount.toString(),
createdAt: r.createdAt,
})),

View File

@ -16,7 +16,7 @@ export interface MiningStatsDto {
totalShares: string;
distributionPool: string;
remainingDistribution: string;
minuteDistribution: string;
secondDistribution: string;
// 参与信息
totalContribution: string;
@ -79,7 +79,7 @@ export class GetMiningStatsQuery {
totalShares: config?.totalShares.toString() || '0',
distributionPool: config?.distributionPool.toString() || '0',
remainingDistribution: config?.remainingDistribution.toString() || '0',
minuteDistribution: config?.minuteDistribution.toString() || '0',
secondDistribution: config?.secondDistribution.toString() || '0',
totalContribution: totalContribution.toString(),
participantCount,
totalMined: totalMined.toString(),

View File

@ -19,14 +19,14 @@ export class MiningScheduler implements OnModuleInit {
}
/**
* Execute the per-second distribution
*/
@Cron(CronExpression.EVERY_MINUTE)
async executeMinuteDistribution(): Promise<void> {
@Cron(CronExpression.EVERY_SECOND)
async executeSecondDistribution(): Promise<void> {
try {
await this.distributionService.executeMinuteDistribution();
await this.distributionService.executeSecondDistribution();
} catch (error) {
this.logger.error('Failed to execute minute distribution', error);
this.logger.error('Failed to execute second distribution', error);
}
}

View File

@ -7,19 +7,23 @@ import { PriceSnapshotRepository } from '../../infrastructure/persistence/reposi
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { MiningCalculatorService } from '../../domain/services/mining-calculator.service';
import { MiningAccountAggregate } from '../../domain/aggregates/mining-account.aggregate';
import { ShareAmount } from '../../domain/value-objects/share-amount.vo';
import { Price } from '../../domain/value-objects/price.vo';
import Decimal from 'decimal.js';
/**
* Mining distribution service
*
* Distributes rewards every second:
* - account balances are updated each second
* - one summary MiningRecord is written per minute
*/
@Injectable()
export class MiningDistributionService {
private readonly logger = new Logger(MiningDistributionService.name);
private readonly calculator = new MiningCalculatorService();
private readonly LOCK_KEY = 'mining:distribution:lock';
private readonly MINUTE_ACCUMULATOR_PREFIX = 'mining:minute:accumulator:';
constructor(
private readonly miningAccountRepository: MiningAccountRepository,
@ -32,52 +36,43 @@ export class MiningDistributionService {
) {}
/**
* Execute the per-second distribution
* - update account balances every second
* - write a summary MiningRecord at the end of each minute
*/
async executeMinuteDistribution(): Promise<void> {
// 获取分布式锁
const lockValue = await this.redis.acquireLock(this.LOCK_KEY, 55);
async executeSecondDistribution(): Promise<void> {
// 获取分布式锁锁定时间900ms
const lockValue = await this.redis.acquireLock(this.LOCK_KEY, 0.9);
if (!lockValue) {
this.logger.debug('Another instance is processing distribution');
return;
}
try {
const config = await this.miningConfigRepository.getConfig();
if (!config || !config.isActive) {
this.logger.debug('Mining is not active');
return;
}
const currentSecond = this.getCurrentSecond();
const currentMinute = this.getCurrentMinute();
const isMinuteEnd = currentSecond.getSeconds() === 59;
// Skip if this minute was already processed
const processedKey = `mining:processed:${currentMinute.toISOString()}`;
// Skip if this second was already processed
const processedKey = `mining:processed:${currentSecond.getTime()}`;
if (await this.redis.get(processedKey)) {
return;
}
// Compute the per-minute distribution
const remainingMinutes = this.calculator.calculateRemainingMinutes(
config.eraStartDate,
MiningCalculatorService.HALVING_PERIOD_MINUTES,
);
// Use the precomputed per-second distribution
const secondDistribution = config.secondDistribution;
const minuteDistribution = this.calculator.calculateMinuteDistribution(
config.remainingDistribution,
config.currentEra,
remainingMinutes,
);
if (minuteDistribution.isZero()) {
this.logger.debug('No distribution available');
if (secondDistribution.isZero()) {
return;
}
// Get accounts with contribution
const totalContribution = await this.miningAccountRepository.getTotalContribution();
if (totalContribution.isZero()) {
this.logger.debug('No contribution available');
return;
}
@ -95,24 +90,23 @@ export class MiningDistributionService {
const reward = this.calculator.calculateUserMiningReward(
account.totalContribution,
totalContribution,
minuteDistribution,
secondDistribution,
);
if (!reward.isZero()) {
account.mine(reward, `分钟挖矿 ${currentMinute.toISOString()}`);
// Update the account balance every second
account.mine(reward, `秒挖矿 ${currentSecond.getTime()}`);
await this.miningAccountRepository.save(account);
// Save the mining record
await this.prisma.miningRecord.create({
data: {
accountSequence: account.accountSequence,
miningMinute: currentMinute,
contributionRatio: account.totalContribution.value.dividedBy(totalContribution.value),
totalContribution: totalContribution.value,
minuteDistribution: minuteDistribution.value,
minedAmount: reward.value,
},
});
// Accumulate this minute's mining data in Redis
await this.accumulateMinuteData(
account.accountSequence,
currentMinute,
reward,
account.totalContribution,
totalContribution,
secondDistribution,
);
totalDistributed = totalDistributed.add(reward);
participantCount++;
@ -123,46 +117,120 @@ export class MiningDistributionService {
page++;
}
// At the end of each minute, write the aggregated MiningRecord
if (isMinuteEnd) {
await this.writeMinuteRecords(currentMinute);
}
// Execute the burn
const burnAmount = await this.executeBurn(currentMinute);
const burnAmount = await this.executeBurn(currentSecond);
// Update the config
const newRemaining = config.remainingDistribution.subtract(totalDistributed);
await this.miningConfigRepository.updateRemainingDistribution(newRemaining);
// Save the per-minute stats
await this.prisma.minuteMiningStat.create({
data: {
minute: currentMinute,
totalContribution: totalContribution.value,
totalDistributed: totalDistributed.value,
participantCount,
burnAmount: burnAmount.value,
},
});
// Mark as processed (expires after 2 seconds)
await this.redis.set(processedKey, '1', 2);
// Save the price snapshot
await this.savePriceSnapshot(currentMinute);
// Mark as processed
await this.redis.set(processedKey, '1', 120);
this.logger.log(
`Minute distribution completed: distributed=${totalDistributed.toFixed(8)}, ` +
`participants=${participantCount}, burned=${burnAmount.toFixed(8)}`,
);
// Log once per minute
if (isMinuteEnd) {
this.logger.log(
`Minute distribution: distributed=${totalDistributed.toFixed(8)}, participants=${participantCount}`,
);
}
} catch (error) {
this.logger.error('Failed to execute minute distribution', error);
throw error;
this.logger.error('Failed to execute second distribution', error);
} finally {
await this.redis.releaseLock(this.LOCK_KEY, lockValue);
}
}
/**
* Accumulate each second's mining data for the current minute in Redis
*/
private async accumulateMinuteData(
accountSequence: string,
minuteTime: Date,
reward: ShareAmount,
accountContribution: ShareAmount,
totalContribution: ShareAmount,
secondDistribution: ShareAmount,
): Promise<void> {
const key = `${this.MINUTE_ACCUMULATOR_PREFIX}${minuteTime.getTime()}:${accountSequence}`;
const existing = await this.redis.get(key);
let accumulated: {
minedAmount: string;
contributionRatio: string;
totalContribution: string;
secondDistribution: string;
secondCount: number;
};
if (existing) {
accumulated = JSON.parse(existing);
accumulated.minedAmount = new Decimal(accumulated.minedAmount).plus(reward.value).toString();
accumulated.secondCount += 1;
// Update to the latest contribution ratio
accumulated.contributionRatio = accountContribution.value.dividedBy(totalContribution.value).toString();
accumulated.totalContribution = totalContribution.value.toString();
accumulated.secondDistribution = secondDistribution.value.toString();
} else {
accumulated = {
minedAmount: reward.value.toString(),
contributionRatio: accountContribution.value.dividedBy(totalContribution.value).toString(),
totalContribution: totalContribution.value.toString(),
secondDistribution: secondDistribution.value.toString(),
secondCount: 1,
};
}
// Expire after 2 minutes so the data is cleaned up even if processing fails
await this.redis.set(key, JSON.stringify(accumulated), 120);
}
/**
* Write the accumulated minute data into summary MiningRecord rows
*/
private async writeMinuteRecords(minuteTime: Date): Promise<void> {
try {
// Fetch all accumulated data for this minute
const pattern = `${this.MINUTE_ACCUMULATOR_PREFIX}${minuteTime.getTime()}:*`;
const keys = await this.redis.keys(pattern);
for (const key of keys) {
const data = await this.redis.get(key);
if (!data) continue;
const accumulated = JSON.parse(data);
const accountSequence = key.split(':').pop();
if (!accountSequence) continue;
// Write the aggregated MiningRecord
await this.prisma.miningRecord.create({
data: {
accountSequence,
miningMinute: minuteTime,
contributionRatio: new Decimal(accumulated.contributionRatio),
totalContribution: new Decimal(accumulated.totalContribution),
secondDistribution: new Decimal(accumulated.secondDistribution),
minedAmount: new Decimal(accumulated.minedAmount),
},
});
// Delete the processed accumulator entry
await this.redis.del(key);
}
} catch (error) {
this.logger.error('Failed to write minute records', error);
}
}
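To make the accumulate-then-flush flow concrete, this is roughly what one accumulator entry looks like in Redis between flushes; the key layout and field names follow accumulateMinuteData above, and the values are illustrative:

// Key: mining:minute:accumulator:<minuteEpochMs>:<accountSequence>, TTL 120 s.
const key = 'mining:minute:accumulator:1760000000000:A000123';
// After three per-second rewards within the same minute:
const accumulated = {
  minedAmount: '0.0014126712', // sum of the per-second rewards so far
  contributionRatio: '0.0297', // latest ratio, overwritten every second
  totalContribution: '1000',
  secondDistribution: '0.015854896',
  secondCount: 3,
};
// writeMinuteRecords() turns each such entry into one MiningRecord row and deletes the key.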
/**
* Execute the burn for the current interval
*/
private async executeBurn(burnMinute: Date): Promise<ShareAmount> {
private async executeBurn(burnSecond: Date): Promise<ShareAmount> {
const blackHole = await this.blackHoleRepository.getBlackHole();
if (!blackHole) {
return ShareAmount.zero();
@ -177,59 +245,37 @@ export class MiningDistributionService {
return ShareAmount.zero();
}
// Compute the remaining burn minutes (over the whole mining period)
const totalBurnMinutes = 10 * 365 * 24 * 60; // 10 years
const remainingMinutes = this.calculator.calculateRemainingBurnMinutes(
// Compute the remaining burn seconds (10 years)
const totalBurnSeconds = 10 * 365 * 24 * 60 * 60;
const remainingSeconds = this.calculator.calculateRemainingSeconds(
config.activatedAt || new Date(),
totalBurnMinutes,
totalBurnSeconds,
);
const burnAmount = this.calculator.calculateMinuteBurn(
const burnAmount = this.calculator.calculateSecondBurn(
blackHole.targetBurn,
blackHole.totalBurned,
remainingMinutes,
remainingSeconds,
);
if (!burnAmount.isZero()) {
await this.blackHoleRepository.recordBurn(burnMinute, burnAmount);
await this.blackHoleRepository.recordBurn(burnSecond, burnAmount);
}
return burnAmount;
}
/**
* Save a price snapshot
*/
private async savePriceSnapshot(snapshotTime: Date): Promise<void> {
const blackHole = await this.blackHoleRepository.getBlackHole();
// 获取流通池数据(需要从 trading-service 获取,这里简化处理)
const circulationPool = ShareAmount.zero(); // TODO: 从 trading-service 获取
// 获取股池数据(初始为分配池,实际需要计算)
const config = await this.miningConfigRepository.getConfig();
const sharePool = config?.distributionPool || ShareAmount.zero();
const burnedAmount = blackHole?.totalBurned || ShareAmount.zero();
const price = this.calculator.calculatePrice(sharePool, burnedAmount, circulationPool);
const effectiveDenominator = MiningCalculatorService.TOTAL_SHARES.value
.minus(burnedAmount.value)
.minus(circulationPool.value);
await this.priceSnapshotRepository.saveSnapshot({
snapshotTime,
price,
sharePool,
blackHoleAmount: burnedAmount,
circulationPool,
effectiveDenominator: new ShareAmount(effectiveDenominator),
});
private getCurrentSecond(): Date {
const now = new Date();
now.setMilliseconds(0);
return now;
}
/**
*
*
*/
private getCurrentMinute(): Date {
const now = new Date();

View File

@ -15,27 +15,8 @@ export class MiningCalculatorService {
// 目标销毁量: 10B
static readonly BURN_TARGET = new ShareAmount('10000000000');
// 减半周期: 2年 (分钟)
static readonly HALVING_PERIOD_MINUTES = 2 * 365 * 24 * 60;
/**
 * Calculate the per-minute distribution amount
* @param remainingDistribution
* @param eraNumber
* @param remainingMinutesInEra
*/
calculateMinuteDistribution(
remainingDistribution: ShareAmount,
eraNumber: number,
remainingMinutesInEra: number,
): ShareAmount {
if (remainingDistribution.isZero() || remainingMinutesInEra <= 0) {
return ShareAmount.zero();
}
// 每分钟分配 = 剩余量 / 剩余分钟数
return remainingDistribution.divide(remainingMinutesInEra);
}
// 减半周期: 2年
static readonly HALVING_PERIOD_SECONDS = 2 * 365 * 24 * 60 * 60; // 63,072,000秒
/**
*
@ -52,33 +33,32 @@ export class MiningCalculatorService {
*
* @param userContribution
* @param totalContribution
* @param minuteDistribution
* @param secondDistribution
*/
calculateUserMiningReward(
userContribution: ShareAmount,
totalContribution: ShareAmount,
minuteDistribution: ShareAmount,
secondDistribution: ShareAmount,
): ShareAmount {
if (totalContribution.isZero() || userContribution.isZero()) {
return ShareAmount.zero();
}
// 用户收益 = 每分钟分配量 * (用户算力 / 总算力)
// 用户收益 = 每秒分配量 * (用户算力 / 总算力)
const ratio = userContribution.value.dividedBy(totalContribution.value);
return minuteDistribution.multiply(ratio);
return secondDistribution.multiply(ratio);
}
/**
 * Calculate the per-second burn amount
* 设计目标: 假设只有黑洞和股池,1
* minuteBurn = (burnTarget - currentBurned) / remainingMinutes
*
* secondBurn = (burnTarget - currentBurned) / remainingSeconds
*/
calculateMinuteBurn(
calculateSecondBurn(
burnTarget: ShareAmount,
currentBurned: ShareAmount,
remainingMinutes: number,
remainingSeconds: number,
): ShareAmount {
if (remainingMinutes <= 0) {
if (remainingSeconds <= 0) {
return ShareAmount.zero();
}
@ -87,7 +67,7 @@ export class MiningCalculatorService {
return ShareAmount.zero();
}
return remaining.divide(remainingMinutes);
return remaining.divide(remainingSeconds);
}
/**
@ -123,23 +103,12 @@ export class MiningCalculatorService {
}
/**
 * Calculate the remaining seconds in the current halving era
 */
calculateRemainingMinutes(eraStartDate: Date, halvingPeriodMinutes: number): number {
calculateRemainingSeconds(eraStartDate: Date, halvingPeriodSeconds: number): number {
const now = new Date();
const elapsedMs = now.getTime() - eraStartDate.getTime();
const elapsedMinutes = Math.floor(elapsedMs / 60000);
return Math.max(0, halvingPeriodMinutes - elapsedMinutes);
}
/**
*
*
*/
calculateRemainingBurnMinutes(startDate: Date, totalMinutes: number): number {
const now = new Date();
const elapsedMs = now.getTime() - startDate.getTime();
const elapsedMinutes = Math.floor(elapsedMs / 60000);
return Math.max(0, totalMinutes - elapsedMinutes);
const elapsedSeconds = Math.floor(elapsedMs / 1000);
return Math.max(0, halvingPeriodSeconds - elapsedSeconds);
}
}
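For a sense of scale after the minute-to-second switch: the 10-year burn window is 315,360,000 seconds and the 2-year halving period is the 63,072,000 seconds noted above, so an untouched 10 B burn target spreads to roughly 31.71 per second. A quick check with illustrative values:

```typescript
import Decimal from 'decimal.js';

// Illustrative sanity check of the per-second burn rate derived from the constants above.
const burnTarget = new Decimal('10000000000');     // BURN_TARGET: 10B
const totalBurnSeconds = 10 * 365 * 24 * 60 * 60;  // 315,360,000 s (10-year burn window)

console.log(burnTarget.dividedBy(totalBurnSeconds).toFixed(8)); // 31.70979198
console.log(2 * 365 * 24 * 60 * 60);               // 63,072,000 = HALVING_PERIOD_SECONDS
```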

View File

@ -10,7 +10,7 @@ export interface MiningConfigEntity {
halvingPeriodYears: number;
currentEra: number;
eraStartDate: Date;
minuteDistribution: ShareAmount;
secondDistribution: ShareAmount;
isActive: boolean;
activatedAt: Date | null;
}
@ -40,7 +40,7 @@ export class MiningConfigRepository {
halvingPeriodYears: config.halvingPeriodYears,
currentEra: config.currentEra,
eraStartDate: config.eraStartDate,
minuteDistribution: config.minuteDistribution?.value,
secondDistribution: config.secondDistribution?.value,
isActive: config.isActive,
activatedAt: config.activatedAt,
},
@ -54,7 +54,7 @@ export class MiningConfigRepository {
halvingPeriodYears: config.halvingPeriodYears || 2,
currentEra: config.currentEra || 1,
eraStartDate: config.eraStartDate || new Date(),
minuteDistribution: config.minuteDistribution?.value || 0,
secondDistribution: config.secondDistribution?.value || 0,
isActive: config.isActive || false,
activatedAt: config.activatedAt,
},
@ -99,7 +99,7 @@ export class MiningConfigRepository {
halvingPeriodYears: record.halvingPeriodYears,
currentEra: record.currentEra,
eraStartDate: record.eraStartDate,
minuteDistribution: new ShareAmount(record.minuteDistribution),
secondDistribution: new ShareAmount(record.secondDistribution),
isActive: record.isActive,
activatedAt: record.activatedAt,
};

View File

@ -64,7 +64,8 @@ export class RedisService implements OnModuleInit, OnModuleDestroy {
async acquireLock(lockKey: string, ttlSeconds: number = 30): Promise<string | null> {
const lockValue = `${Date.now()}-${Math.random().toString(36).substring(7)}`;
const result = await this.client.set(lockKey, lockValue, 'EX', ttlSeconds, 'NX');
const ttlMs = Math.round(ttlSeconds * 1000);
const result = await this.client.set(lockKey, lockValue, 'PX', ttlMs, 'NX');
return result === 'OK' ? lockValue : null;
}
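With `PX` the TTL is passed in milliseconds, so `ttlSeconds` no longer has to be a whole number and short critical sections can hold sub-second locks. A small usage sketch from a caller's point of view (the lock key here is a hypothetical example):

```typescript
// Acquire a lock that expires after 500 ms rather than a full second.
const lockValue = await this.redis.acquireLock('example:tick:lock', 0.5);
if (lockValue) {
  try {
    // ... once-per-tick work ...
  } finally {
    await this.redis.releaseLock('example:tick:lock', lockValue);
  }
}
```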
@ -87,4 +88,12 @@ export class RedisService implements OnModuleInit, OnModuleDestroy {
async incrByFloat(key: string, increment: number): Promise<string> {
return this.client.incrbyfloat(key, increment);
}
async keys(pattern: string): Promise<string[]> {
return this.client.keys(pattern);
}
async del(key: string): Promise<number> {
return this.client.del(key);
}
}

View File

@ -20,5 +20,7 @@
"paths": {
"@/*": ["src/*"]
}
}
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist", "prisma"]
}

View File

@ -30,14 +30,15 @@ WORKDIR /app
USER nestjs
COPY --chown=nestjs:nodejs package*.json ./
RUN npm ci --only=production && npm cache clean --force
COPY --chown=nestjs:nodejs tsconfig*.json ./
RUN npm ci --only=production && npm install ts-node typescript @types/node --save-dev && npm cache clean --force
COPY --chown=nestjs:nodejs prisma ./prisma/
RUN DATABASE_URL="postgresql://user:pass@localhost:5432/db" npx prisma generate
COPY --chown=nestjs:nodejs --from=builder /app/dist ./dist
RUN printf '#!/bin/sh\nset -e\necho "Running database migrations..."\nnpx prisma migrate deploy\necho "Starting application..."\nexec node dist/main.js\n' > /app/start.sh && chmod +x /app/start.sh
RUN printf '#!/bin/sh\nset -e\necho "Running database migrations..."\nnpx prisma migrate deploy\necho "Running database seed..."\nnpx prisma db seed || echo "Seed skipped or already applied"\necho "Starting application..."\nexec node dist/main.js\n' > /app/start.sh && chmod +x /app/start.sh
ENV NODE_ENV=production
ENV TZ=Asia/Shanghai

View File

@ -16,7 +16,8 @@
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate dev",
"prisma:migrate:prod": "prisma migrate deploy",
"prisma:studio": "prisma studio"
"prisma:studio": "prisma studio",
"prisma:seed": "ts-node prisma/seed.ts"
},
"dependencies": {
"@nestjs/common": "^10.3.0",
@ -38,6 +39,9 @@
"rxjs": "^7.8.1",
"swagger-ui-express": "^5.0.0"
},
"prisma": {
"seed": "ts-node prisma/seed.ts"
},
"devDependencies": {
"@nestjs/cli": "^10.2.1",
"@nestjs/schematics": "^10.0.3",

View File

@ -0,0 +1,144 @@
import { PrismaClient } from '@prisma/client';
import Decimal from 'decimal.js';
const prisma = new PrismaClient();
async function main() {
console.log('Seeding mining-wallet-service database...');
// 1. 初始化核心系统账户(总部、运营、手续费)
const systemAccounts = [
{ accountType: 'HEADQUARTERS', name: '总部账户', code: 'HQ' },
{ accountType: 'OPERATION', name: '运营账户', code: 'OP' },
{ accountType: 'FEE', name: '手续费账户', code: 'FEE' },
];
for (const account of systemAccounts) {
const existing = await prisma.systemAccount.findFirst({
where: { code: account.code },
});
if (!existing) {
const created = await prisma.systemAccount.create({
data: {
accountType: account.accountType as any,
name: account.name,
code: account.code,
isActive: true,
},
});
// 发布系统账户创建事件到 Outbox
await prisma.outboxEvent.create({
data: {
aggregateType: 'SystemAccount',
aggregateId: created.id,
eventType: 'WalletSystemAccountCreated',
topic: 'mining-wallet.system-account.created',
key: created.code,
payload: {
id: created.id,
accountType: created.accountType,
name: created.name,
code: created.code,
provinceId: null,
cityId: null,
shareBalance: 0,
usdtBalance: 0,
greenPointBalance: 0,
frozenShare: 0,
frozenUsdt: 0,
totalInflow: 0,
totalOutflow: 0,
blockchainAddress: null,
isActive: created.isActive,
},
},
});
console.log(`Created system account: ${account.code}`);
} else {
console.log(`System account already exists: ${account.code}`);
}
}
// 2. 初始化池账户(积分股池、黑洞池、流通池)
const poolAccounts = [
{
poolType: 'SHARE_POOL',
name: '积分股池',
balance: new Decimal('100000000'), // 1亿初始发行量
description: '挖矿奖励的来源池,总发行量',
},
{
poolType: 'BLACK_HOLE_POOL',
name: '黑洞池',
balance: new Decimal('0'),
targetBurn: new Decimal('50000000'), // 目标销毁5000万
description: '销毁的积分股,用于减少流通量',
},
{
poolType: 'CIRCULATION_POOL',
name: '流通池',
balance: new Decimal('0'),
description: '市场流通的积分股',
},
];
for (const pool of poolAccounts) {
const existing = await prisma.poolAccount.findFirst({
where: { poolType: pool.poolType as any },
});
if (!existing) {
const created = await prisma.poolAccount.create({
data: {
poolType: pool.poolType as any,
name: pool.name,
balance: pool.balance,
targetBurn: pool.targetBurn,
remainingBurn: pool.targetBurn,
description: pool.description,
isActive: true,
},
});
// 发布池账户创建事件到 Outbox
await prisma.outboxEvent.create({
data: {
aggregateType: 'PoolAccount',
aggregateId: created.id,
eventType: 'WalletPoolAccountCreated',
topic: 'mining-wallet.pool-account.created',
key: created.poolType,
payload: {
id: created.id,
poolType: created.poolType,
name: created.name,
balance: created.balance.toString(),
totalInflow: 0,
totalOutflow: 0,
targetBurn: created.targetBurn?.toString() || null,
remainingBurn: created.remainingBurn?.toString() || null,
isActive: created.isActive,
},
},
});
console.log(`Created pool account: ${pool.poolType}`);
} else {
console.log(`Pool account already exists: ${pool.poolType}`);
}
}
console.log('Seeding completed!');
}
main()
.catch((e) => {
console.error('Seeding failed:', e);
process.exit(1);
})
.finally(async () => {
await prisma.$disconnect();
});

View File

@ -1,7 +1,7 @@
import { Controller, Get, Post, Body, Param, Query } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiResponse, ApiBearerAuth, ApiQuery } from '@nestjs/swagger';
import { PoolAccountService } from '../../application/services/pool-account.service';
import { AdminOnly } from '../../shared/guards/jwt-auth.guard';
import { AdminOnly, Public } from '../../shared/guards/jwt-auth.guard';
import { PoolAccountType, TransactionType } from '@prisma/client';
import Decimal from 'decimal.js';
@ -73,8 +73,8 @@ export class PoolAccountController {
}
@Post('initialize')
@AdminOnly()
@ApiOperation({ summary: '初始化池账户' })
@Public()
@ApiOperation({ summary: '初始化池账户(仅限内网调用)' })
@ApiResponse({ status: 201, description: '池账户初始化成功' })
async initialize(@Body() dto: InitializePoolsDto) {
return this.poolAccountService.initializePools({

View File

@ -1,7 +1,7 @@
import { Controller, Get, Post, Body, Param, Query } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiResponse, ApiBearerAuth } from '@nestjs/swagger';
import { SystemAccountService } from '../../application/services/system-account.service';
import { AdminOnly } from '../../shared/guards/jwt-auth.guard';
import { AdminOnly, Public } from '../../shared/guards/jwt-auth.guard';
import { SystemAccountType } from '@prisma/client';
class InitializeSystemAccountsDto {
@ -47,8 +47,8 @@ export class SystemAccountController {
}
@Post('initialize')
@AdminOnly()
@ApiOperation({ summary: '初始化核心系统账户' })
@Public()
@ApiOperation({ summary: '初始化核心系统账户(仅限内网调用)' })
@ApiResponse({ status: 201, description: '系统账户初始化成功' })
async initialize(@Body() dto: InitializeSystemAccountsDto) {
return this.systemAccountService.initializeCoreAccounts(dto);

View File

@ -17,6 +17,11 @@ import { UserRegisteredConsumer } from '../infrastructure/kafka/consumers/user-r
@Module({
imports: [ScheduleModule.forRoot()],
controllers: [
// Kafka Consumers (微服务消息处理器需要是 Controller)
ContributionDistributionConsumer,
UserRegisteredConsumer,
],
providers: [
// Services
SystemAccountService,
@ -26,9 +31,6 @@ import { UserRegisteredConsumer } from '../infrastructure/kafka/consumers/user-r
// Schedulers
OutboxScheduler,
ContributionExpiryScheduler,
// Consumers
ContributionDistributionConsumer,
UserRegisteredConsumer,
],
exports: [
SystemAccountService,

View File

@ -51,6 +51,7 @@ export class ContributionWalletService {
},
});
const isNewWallet = !wallet;
if (!wallet) {
wallet = await tx.userWallet.create({
data: {
@ -60,13 +61,34 @@ export class ContributionWalletService {
frozenBalance: new Decimal(0),
},
});
// 发布 UserWalletCreated 事件
await tx.outboxEvent.create({
data: {
aggregateType: 'UserWallet',
aggregateId: wallet.id,
eventType: 'UserWalletCreated',
topic: 'cdc.mining-wallet.outbox',
key: input.accountSequence,
payload: {
id: wallet.id,
accountSequence: wallet.accountSequence,
walletType: wallet.walletType,
balance: '0',
frozenBalance: '0',
totalInflow: 0,
totalOutflow: 0,
isActive: true,
},
},
});
}
const balanceBefore = new Decimal(wallet.balance.toString());
const balanceAfter = balanceBefore.plus(input.amount);
// 2. 更新钱包余额
await tx.userWallet.update({
const updatedWallet = await tx.userWallet.update({
where: { id: wallet.id },
data: {
balance: balanceAfter,
@ -74,6 +96,27 @@ export class ContributionWalletService {
},
});
// 发布 UserWalletUpdated 事件(用于 mining-admin-service 同步)
await tx.outboxEvent.create({
data: {
aggregateType: 'UserWallet',
aggregateId: wallet.id,
eventType: 'UserWalletUpdated',
topic: 'cdc.mining-wallet.outbox',
key: input.accountSequence,
payload: {
id: wallet.id,
accountSequence: wallet.accountSequence,
walletType: wallet.walletType,
balance: balanceAfter.toString(),
frozenBalance: updatedWallet.frozenBalance.toString(),
totalInflow: updatedWallet.totalInflow.toString(),
totalOutflow: updatedWallet.totalOutflow.toString(),
isActive: true,
},
},
});
// 3. 创建交易记录(分类账)
const transaction = await tx.userWalletTransaction.create({
data: {
@ -151,15 +194,25 @@ export class ContributionWalletService {
};
}
const systemAccount = await tx.systemAccount.findFirst({
let systemAccount = await tx.systemAccount.findFirst({
where: whereClause,
});
// 如果找不到,尝试自动创建省/市级系统账户
if (!systemAccount) {
this.logger.warn(
`System account not found: ${input.accountType}, province: ${input.provinceCode}, city: ${input.cityCode}`,
systemAccount = await this.createSystemAccountIfNeeded(
tx,
input.accountType,
input.provinceCode,
input.cityCode,
);
return;
if (!systemAccount) {
this.logger.warn(
`Failed to create system account: ${input.accountType}, province: ${input.provinceCode}, city: ${input.cityCode}`,
);
return;
}
}
const balanceBefore = new Decimal(
@ -237,7 +290,7 @@ export class ContributionWalletService {
}
// 更新钱包余额
await tx.userWallet.update({
const updatedWallet = await tx.userWallet.update({
where: { id: wallet.id },
data: {
balance: balanceAfter,
@ -245,6 +298,27 @@ export class ContributionWalletService {
},
});
// 发布 UserWalletUpdated 事件(用于 mining-admin-service 同步)
await tx.outboxEvent.create({
data: {
aggregateType: 'UserWallet',
aggregateId: wallet.id,
eventType: 'UserWalletUpdated',
topic: 'cdc.mining-wallet.outbox',
key: accountSequence,
payload: {
id: wallet.id,
accountSequence: wallet.accountSequence,
walletType: wallet.walletType,
balance: balanceAfter.toString(),
frozenBalance: updatedWallet.frozenBalance.toString(),
totalInflow: updatedWallet.totalInflow.toString(),
totalOutflow: updatedWallet.totalOutflow.toString(),
isActive: true,
},
},
});
// 创建过期交易记录
await tx.userWalletTransaction.create({
data: {
@ -281,4 +355,164 @@ export class ContributionWalletService {
};
return `${typeMap[input.contributionType]}, 来源认种: ${input.sourceAdoptionId}, 认种人: ${input.sourceAccountSequence}`;
}
/**
 * Auto-create a province/city system account when it does not exist yet
 */
private async createSystemAccountIfNeeded(
tx: any,
accountType: string,
provinceCode?: string,
cityCode?: string,
): Promise<any | null> {
// 只处理省/市级账户的自动创建
if (accountType === 'PROVINCE' && provinceCode) {
// 先找或创建省份
let province = await tx.province.findUnique({
where: { code: provinceCode },
});
if (!province) {
province = await tx.province.create({
data: {
code: provinceCode,
name: provinceCode,
status: 'ACTIVE',
},
});
this.logger.log(`Auto-created province: ${provinceCode}`);
}
// 创建省级系统账户
const account = await tx.systemAccount.create({
data: {
accountType: 'PROVINCE',
name: `${province.name}账户`,
code: `PROV-${provinceCode}`,
provinceId: province.id,
isActive: true,
},
});
this.logger.log(`Auto-created province system account: ${account.code}`);
// 发布系统账户创建事件到 Outbox
await tx.outboxEvent.create({
data: {
aggregateType: 'SystemAccount',
aggregateId: account.id,
eventType: 'WalletSystemAccountCreated',
topic: 'mining-wallet.system-account.created',
key: account.code,
payload: {
id: account.id,
accountType: account.accountType,
name: account.name,
code: account.code,
provinceId: account.provinceId,
cityId: null,
shareBalance: 0,
usdtBalance: 0,
greenPointBalance: 0,
frozenShare: 0,
frozenUsdt: 0,
totalInflow: 0,
totalOutflow: 0,
blockchainAddress: null,
isActive: account.isActive,
},
},
});
return account;
}
if (accountType === 'CITY' && cityCode) {
// 先找城市
let city = await tx.city.findUnique({
where: { code: cityCode },
});
if (!city) {
// 城市不存在,需要先有省份
if (!provinceCode) {
this.logger.warn(`Cannot create city without provinceCode: ${cityCode}`);
return null;
}
// 找或创建省份
let province = await tx.province.findUnique({
where: { code: provinceCode },
});
if (!province) {
province = await tx.province.create({
data: {
code: provinceCode,
name: provinceCode,
status: 'ACTIVE',
},
});
this.logger.log(`Auto-created province: ${provinceCode}`);
}
// 创建城市
city = await tx.city.create({
data: {
code: cityCode,
name: cityCode,
provinceId: province.id,
status: 'ACTIVE',
},
});
this.logger.log(`Auto-created city: ${cityCode}`);
}
// 创建市级系统账户
const account = await tx.systemAccount.create({
data: {
accountType: 'CITY',
name: `${city.name}账户`,
code: `CITY-${cityCode}`,
provinceId: city.provinceId,
cityId: city.id,
isActive: true,
},
});
this.logger.log(`Auto-created city system account: ${account.code}`);
// 发布系统账户创建事件到 Outbox
await tx.outboxEvent.create({
data: {
aggregateType: 'SystemAccount',
aggregateId: account.id,
eventType: 'WalletSystemAccountCreated',
topic: 'mining-wallet.system-account.created',
key: account.code,
payload: {
id: account.id,
accountType: account.accountType,
name: account.name,
code: account.code,
provinceId: account.provinceId,
cityId: account.cityId,
shareBalance: 0,
usdtBalance: 0,
greenPointBalance: 0,
frozenShare: 0,
frozenUsdt: 0,
totalInflow: 0,
totalOutflow: 0,
blockchainAddress: null,
isActive: account.isActive,
},
},
});
return account;
}
// 其他类型HEADQUARTERS, OPERATION, FEE不自动创建需要在 seed 中初始化
return null;
}
}

View File

@ -1,4 +1,4 @@
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { Controller, Logger, OnModuleInit } from '@nestjs/common';
import { EventPattern, Payload } from '@nestjs/microservices';
import Decimal from 'decimal.js';
import { PrismaService } from '../../persistence/prisma/prisma.service';
@ -9,12 +9,14 @@ import { SystemAccountService } from '../../../application/services/system-accou
import {
ContributionDistributionCompletedEvent,
ContributionDistributionPayload,
BonusClaimedEvent,
BonusClaimedPayload,
} from '../events/contribution-distribution.event';
// 4小时 TTL
const IDEMPOTENCY_TTL_SECONDS = 4 * 60 * 60;
@Injectable()
@Controller()
export class ContributionDistributionConsumer implements OnModuleInit {
private readonly logger = new Logger(ContributionDistributionConsumer.name);
@ -114,6 +116,65 @@ export class ContributionDistributionConsumer implements OnModuleInit {
}
}
/**
 * Handle BonusClaimed events from contribution-service
 * and credit the claimed team-bonus contributions to user wallets
 */
@EventPattern('contribution.bonus.claimed')
async handleBonusClaimed(@Payload() message: any): Promise<void> {
const event: BonusClaimedEvent = message.value || message;
const eventId = event.eventId || message.eventId;
if (!eventId) {
this.logger.warn('Received BonusClaimed event without eventId, skipping');
return;
}
this.logger.debug(`Processing bonus claim event: ${eventId}`);
// 幂等性检查
if (await this.isEventProcessed(eventId)) {
this.logger.debug(`Event ${eventId} already processed, skipping`);
return;
}
try {
await this.processBonusClaim(event.payload);
// 标记为已处理
await this.markEventProcessed(eventId, event.eventType);
this.logger.log(
`Bonus claim for ${event.payload.accountSequence} T${event.payload.bonusTier} processed: ` +
`${event.payload.claimedCount} records`,
);
} catch (error) {
this.logger.error(
`Failed to process bonus claim for ${event.payload.accountSequence}`,
error instanceof Error ? error.stack : error,
);
throw error; // 让 Kafka 重试
}
}
/**
 * Credit each claimed bonus contribution to the corresponding user wallet
 */
private async processBonusClaim(payload: BonusClaimedPayload): Promise<void> {
for (const contrib of payload.userContributions) {
await this.contributionWalletService.creditContribution({
accountSequence: contrib.accountSequence,
amount: new Decimal(contrib.amount),
contributionType: contrib.contributionType,
bonusTier: contrib.bonusTier,
effectiveDate: new Date(contrib.effectiveDate),
expireDate: new Date(contrib.expireDate),
sourceAdoptionId: contrib.sourceAdoptionId,
sourceAccountSequence: contrib.sourceAccountSequence,
});
}
}
/**
 * Idempotency check - Redis + DB, 4-hour TTL
 */

View File

@ -1,4 +1,4 @@
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { Controller, Logger, OnModuleInit } from '@nestjs/common';
import { EventPattern, Payload } from '@nestjs/microservices';
import { RedisService } from '../../redis/redis.service';
import { ProcessedEventRepository } from '../../persistence/repositories/processed-event.repository';
@ -8,7 +8,7 @@ import { UserRegisteredEvent } from '../events/contribution-distribution.event';
// 4小时 TTL
const IDEMPOTENCY_TTL_SECONDS = 4 * 60 * 60;
@Injectable()
@Controller()
export class UserRegisteredConsumer implements OnModuleInit {
private readonly logger = new Logger(UserRegisteredConsumer.name);

View File

@ -54,6 +54,36 @@ export interface UnallocatedContributionItem {
bonusTier?: number;
}
/**
 * Team bonus claimed event
 * Published by contribution-service
 */
export interface BonusClaimedEvent {
eventType: 'BonusClaimed';
eventId: string;
timestamp: string;
payload: BonusClaimedPayload;
}
export interface BonusClaimedPayload {
accountSequence: string;
bonusTier: number;
claimedCount: number;
userContributions: BonusClaimedContributionItem[];
}
export interface BonusClaimedContributionItem {
accountSequence: string;
contributionType: 'TEAM_BONUS';
amount: string;
bonusTier: number;
effectiveDate: string;
expireDate: string;
sourceAdoptionId: string;
sourceAccountSequence: string;
isBackfill: boolean;
}
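For reference, a message matching these interfaces might look like the following; every value is an illustrative placeholder, not real data:

```typescript
const example: BonusClaimedEvent = {
  eventType: 'BonusClaimed',
  eventId: 'evt-0001',                      // placeholder
  timestamp: '2026-01-15T00:00:00.000Z',
  payload: {
    accountSequence: 'A0000001',            // placeholder account
    bonusTier: 3,
    claimedCount: 1,
    userContributions: [
      {
        accountSequence: 'A0000001',
        contributionType: 'TEAM_BONUS',
        amount: '12.5',
        bonusTier: 3,
        effectiveDate: '2026-01-15T00:00:00.000Z',
        expireDate: '2026-02-14T00:00:00.000Z',
        sourceAdoptionId: 'adoption-0001',  // placeholder
        sourceAccountSequence: 'A0000002',
        isBackfill: false,
      },
    ],
  },
};
```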
/**
 * User registered event
 * Published by auth-service

View File

@ -56,6 +56,10 @@ async function bootstrap() {
consumer: {
groupId: 'mining-wallet-service-group',
},
subscribe: {
// 显式订阅需要消费的 topics
fromBeginning: true,
},
},
});

View File

@ -20,5 +20,7 @@
"paths": {
"@/*": ["src/*"]
}
}
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist", "prisma"]
}

View File

@ -47,14 +47,14 @@ CREATE TABLE "orders" (
CREATE TABLE "trades" (
"id" TEXT NOT NULL,
"tradeNo" TEXT NOT NULL,
"buyOrderId" TEXT NOT NULL,
"sellOrderId" TEXT NOT NULL,
"buyerSequence" TEXT NOT NULL,
"sellerSequence" TEXT NOT NULL,
"buy_order_id" TEXT NOT NULL,
"sell_order_id" TEXT NOT NULL,
"buyer_sequence" TEXT NOT NULL,
"seller_sequence" TEXT NOT NULL,
"price" DECIMAL(30,18) NOT NULL,
"quantity" DECIMAL(30,8) NOT NULL,
"amount" DECIMAL(30,8) NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "trades_pkey" PRIMARY KEY ("id")
);
@ -229,13 +229,13 @@ CREATE INDEX "orders_createdAt_idx" ON "orders"("createdAt" DESC);
CREATE UNIQUE INDEX "trades_tradeNo_key" ON "trades"("tradeNo");
-- CreateIndex
CREATE INDEX "trades_buyerSequence_idx" ON "trades"("buyerSequence");
CREATE INDEX "trades_buyer_sequence_idx" ON "trades"("buyer_sequence");
-- CreateIndex
CREATE INDEX "trades_sellerSequence_idx" ON "trades"("sellerSequence");
CREATE INDEX "trades_seller_sequence_idx" ON "trades"("seller_sequence");
-- CreateIndex
CREATE INDEX "trades_createdAt_idx" ON "trades"("createdAt" DESC);
CREATE INDEX "trades_created_at_idx" ON "trades"("created_at" DESC);
-- CreateIndex
CREATE INDEX "trading_transactions_accountSequence_createdAt_idx" ON "trading_transactions"("accountSequence", "createdAt" DESC);
@ -307,7 +307,7 @@ CREATE INDEX "outbox_events_created_at_idx" ON "outbox_events"("created_at");
ALTER TABLE "orders" ADD CONSTRAINT "orders_accountSequence_fkey" FOREIGN KEY ("accountSequence") REFERENCES "trading_accounts"("accountSequence") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "trades" ADD CONSTRAINT "trades_buyOrderId_fkey" FOREIGN KEY ("buyOrderId") REFERENCES "orders"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
ALTER TABLE "trades" ADD CONSTRAINT "trades_buy_order_id_fkey" FOREIGN KEY ("buy_order_id") REFERENCES "orders"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "trading_transactions" ADD CONSTRAINT "trading_transactions_accountSequence_fkey" FOREIGN KEY ("accountSequence") REFERENCES "trading_accounts"("accountSequence") ON DELETE RESTRICT ON UPDATE CASCADE;

View File

@ -0,0 +1,139 @@
-- ============================================================================
-- trading-service 添加交易销毁系统
-- 包含:交易配置、黑洞账户、积分股池、价格快照、订单销毁字段
-- ============================================================================
-- ==================== 交易配置表 ====================
-- CreateTable
CREATE TABLE "trading_configs" (
"id" TEXT NOT NULL,
"total_shares" DECIMAL(30,8) NOT NULL DEFAULT 100020000000,
"burn_target" DECIMAL(30,8) NOT NULL DEFAULT 10000000000,
"burn_period_minutes" INTEGER NOT NULL DEFAULT 2102400,
"minute_burn_rate" DECIMAL(30,18) NOT NULL DEFAULT 4756.468797564687,
"is_active" BOOLEAN NOT NULL DEFAULT false,
"activated_at" TIMESTAMP(3),
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
CONSTRAINT "trading_configs_pkey" PRIMARY KEY ("id")
);
-- ==================== 黑洞账户(销毁池)====================
-- CreateTable
CREATE TABLE "black_holes" (
"id" TEXT NOT NULL,
"total_burned" DECIMAL(30,8) NOT NULL DEFAULT 0,
"target_burn" DECIMAL(30,8) NOT NULL,
"remaining_burn" DECIMAL(30,8) NOT NULL,
"last_burn_minute" TIMESTAMP(3),
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
CONSTRAINT "black_holes_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "burn_records" (
"id" TEXT NOT NULL,
"black_hole_id" TEXT NOT NULL,
"burn_minute" TIMESTAMP(3) NOT NULL,
"burn_amount" DECIMAL(30,18) NOT NULL,
"remaining_target" DECIMAL(30,8) NOT NULL,
"source_type" TEXT,
"source_account_seq" TEXT,
"source_order_no" TEXT,
"memo" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "burn_records_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "burn_records_burn_minute_idx" ON "burn_records"("burn_minute");
-- CreateIndex
CREATE INDEX "burn_records_source_account_seq_idx" ON "burn_records"("source_account_seq");
-- CreateIndex
CREATE INDEX "burn_records_source_order_no_idx" ON "burn_records"("source_order_no");
-- CreateIndex
CREATE INDEX "burn_records_source_type_idx" ON "burn_records"("source_type");
-- AddForeignKey
ALTER TABLE "burn_records" ADD CONSTRAINT "burn_records_black_hole_id_fkey" FOREIGN KEY ("black_hole_id") REFERENCES "black_holes"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- ==================== 积分股池(绿积分池)====================
-- CreateTable
CREATE TABLE "share_pools" (
"id" TEXT NOT NULL,
"green_points" DECIMAL(30,8) NOT NULL DEFAULT 0,
"total_inflow" DECIMAL(30,8) NOT NULL DEFAULT 0,
"total_outflow" DECIMAL(30,8) NOT NULL DEFAULT 0,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
CONSTRAINT "share_pools_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "share_pool_transactions" (
"id" TEXT NOT NULL,
"pool_id" TEXT NOT NULL,
"type" TEXT NOT NULL,
"amount" DECIMAL(30,8) NOT NULL,
"balance_before" DECIMAL(30,8) NOT NULL,
"balance_after" DECIMAL(30,8) NOT NULL,
"reference_id" TEXT,
"reference_type" TEXT,
"memo" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "share_pool_transactions_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "share_pool_transactions_pool_id_created_at_idx" ON "share_pool_transactions"("pool_id", "created_at" DESC);
-- AddForeignKey
ALTER TABLE "share_pool_transactions" ADD CONSTRAINT "share_pool_transactions_pool_id_fkey" FOREIGN KEY ("pool_id") REFERENCES "share_pools"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- ==================== 价格快照 ====================
-- CreateTable
CREATE TABLE "price_snapshots" (
"id" TEXT NOT NULL,
"snapshot_time" TIMESTAMP(3) NOT NULL,
"price" DECIMAL(30,18) NOT NULL,
"green_points" DECIMAL(30,8) NOT NULL,
"black_hole_amount" DECIMAL(30,8) NOT NULL,
"circulation_pool" DECIMAL(30,8) NOT NULL,
"effective_denominator" DECIMAL(30,8) NOT NULL,
"minute_burn_rate" DECIMAL(30,18) NOT NULL,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "price_snapshots_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "price_snapshots_snapshot_time_key" ON "price_snapshots"("snapshot_time");
-- CreateIndex
CREATE INDEX "price_snapshots_snapshot_time_idx" ON "price_snapshots"("snapshot_time" DESC);
-- ==================== 订单表添加销毁相关字段 ====================
-- AlterTable: 添加销毁相关字段到 orders 表
ALTER TABLE "orders" ADD COLUMN "burn_quantity" DECIMAL(30,8) NOT NULL DEFAULT 0;
ALTER TABLE "orders" ADD COLUMN "burn_multiplier" DECIMAL(30,18) NOT NULL DEFAULT 0;
ALTER TABLE "orders" ADD COLUMN "effective_quantity" DECIMAL(30,8) NOT NULL DEFAULT 0;
-- ==================== 成交记录表添加销毁相关字段 ====================
-- 添加销毁相关字段到 trades 表
ALTER TABLE "trades" ADD COLUMN "burn_quantity" DECIMAL(30,8) NOT NULL DEFAULT 0;
ALTER TABLE "trades" ADD COLUMN "effective_qty" DECIMAL(30,8) NOT NULL DEFAULT 0;

View File

@ -0,0 +1,23 @@
-- ============================================================================
-- trading-service 添加已处理事件表(幂等性支持)
-- ============================================================================
-- CreateTable
CREATE TABLE "processed_events" (
"id" TEXT NOT NULL,
"event_id" TEXT NOT NULL,
"event_type" TEXT NOT NULL,
"source_service" TEXT NOT NULL,
"processed_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "processed_events_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "processed_events_event_id_key" ON "processed_events"("event_id");
-- CreateIndex
CREATE INDEX "processed_events_event_id_idx" ON "processed_events"("event_id");
-- CreateIndex
CREATE INDEX "processed_events_processed_at_idx" ON "processed_events"("processed_at");

View File

@ -7,6 +7,125 @@ datasource db {
url = env("DATABASE_URL")
}
// ==================== 交易配置 ====================
// 交易全局配置
model TradingConfig {
id String @id @default(uuid())
// 总积分股数量: 100.02B
totalShares Decimal @default(100020000000) @map("total_shares") @db.Decimal(30, 8)
// 目标销毁量: 100亿 (4年销毁完)
burnTarget Decimal @default(10000000000) @map("burn_target") @db.Decimal(30, 8)
// 销毁周期: 4年 (分钟数) 365*4*1440 = 2102400
burnPeriodMinutes Int @default(2102400) @map("burn_period_minutes")
// 每分钟基础销毁量: 100亿÷(365*4*1440) = 4756.468797564687
minuteBurnRate Decimal @default(4756.468797564687) @map("minute_burn_rate") @db.Decimal(30, 18)
// 是否启用交易
isActive Boolean @default(false) @map("is_active")
// 启动时间
activatedAt DateTime? @map("activated_at")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("trading_configs")
}
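The `minuteBurnRate` default can be cross-checked against `burnTarget` and `burnPeriodMinutes`; a quick verification sketch using decimal.js, as elsewhere in the services:

```typescript
import Decimal from 'decimal.js';

// burn_target / burn_period_minutes should reproduce the minute_burn_rate default.
const burnTarget = new Decimal('10000000000');  // 100亿
const burnPeriodMinutes = 365 * 4 * 1440;       // 2,102,400 minutes over 4 years

const minuteBurnRate = burnTarget.dividedBy(burnPeriodMinutes);
console.log(minuteBurnRate.toDecimalPlaces(12, Decimal.ROUND_DOWN).toString());
// 4756.468797564687, matches the minute_burn_rate default above
```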
// ==================== 黑洞账户(销毁池)====================
// 黑洞账户
model BlackHole {
id String @id @default(uuid())
totalBurned Decimal @default(0) @map("total_burned") @db.Decimal(30, 8) // 已销毁总量
targetBurn Decimal @map("target_burn") @db.Decimal(30, 8) // 目标销毁量 (10B)
remainingBurn Decimal @map("remaining_burn") @db.Decimal(30, 8) // 剩余待销毁
lastBurnMinute DateTime? @map("last_burn_minute")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
records BurnRecord[]
@@map("black_holes")
}
// 销毁记录
model BurnRecord {
id String @id @default(uuid())
blackHoleId String @map("black_hole_id")
burnMinute DateTime @map("burn_minute")
burnAmount Decimal @map("burn_amount") @db.Decimal(30, 18)
remainingTarget Decimal @map("remaining_target") @db.Decimal(30, 8) // 销毁后剩余目标
// 来源信息
sourceType String? @map("source_type") // MINUTE_BURN (每分钟销毁), SELL_BURN (卖出销毁)
sourceAccountSeq String? @map("source_account_seq") // 来源账户序列号(卖出时)
sourceOrderNo String? @map("source_order_no") // 来源订单号(卖出时)
memo String? @db.Text
createdAt DateTime @default(now()) @map("created_at")
blackHole BlackHole @relation(fields: [blackHoleId], references: [id])
@@index([burnMinute])
@@index([sourceAccountSeq])
@@index([sourceOrderNo])
@@index([sourceType])
@@map("burn_records")
}
// ==================== 积分股池(绿积分池)====================
// 积分股池(存储绿积分用于计算价格)
model SharePool {
id String @id @default(uuid())
// 绿积分总量(用于价格计算的分子)
greenPoints Decimal @default(0) @map("green_points") @db.Decimal(30, 8)
totalInflow Decimal @default(0) @map("total_inflow") @db.Decimal(30, 8)
totalOutflow Decimal @default(0) @map("total_outflow") @db.Decimal(30, 8)
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
transactions SharePoolTransaction[]
@@map("share_pools")
}
// 积分股池交易记录
model SharePoolTransaction {
id String @id @default(uuid())
poolId String @map("pool_id")
type String // INJECT (注入), TRADE_IN (交易流入), TRADE_OUT (交易流出)
amount Decimal @db.Decimal(30, 8)
balanceBefore Decimal @map("balance_before") @db.Decimal(30, 8)
balanceAfter Decimal @map("balance_after") @db.Decimal(30, 8)
referenceId String? @map("reference_id")
referenceType String? @map("reference_type")
memo String? @db.Text
createdAt DateTime @default(now()) @map("created_at")
pool SharePool @relation(fields: [poolId], references: [id])
@@index([poolId, createdAt(sort: Desc)])
@@map("share_pool_transactions")
}
// ==================== 价格快照 ====================
// 价格快照(每分钟)
model PriceSnapshot {
id String @id @default(uuid())
snapshotTime DateTime @unique @map("snapshot_time")
price Decimal @db.Decimal(30, 18) // 当时价格
greenPoints Decimal @map("green_points") @db.Decimal(30, 8) // 绿积分(股池)
blackHoleAmount Decimal @map("black_hole_amount") @db.Decimal(30, 8) // 黑洞数量
circulationPool Decimal @map("circulation_pool") @db.Decimal(30, 8) // 流通池
effectiveDenominator Decimal @map("effective_denominator") @db.Decimal(30, 8) // 有效分母
minuteBurnRate Decimal @map("minute_burn_rate") @db.Decimal(30, 18) // 当时的每分钟销毁率
createdAt DateTime @default(now()) @map("created_at")
@@index([snapshotTime(sort: Desc)])
@@map("price_snapshots")
}
// ==================== 交易账户 ====================
// 用户交易账户
@ -43,6 +162,10 @@ model Order {
remainingQuantity Decimal @db.Decimal(30, 8) // 剩余数量
averagePrice Decimal @default(0) @db.Decimal(30, 18) // 平均成交价
totalAmount Decimal @default(0) @db.Decimal(30, 8) // 总成交金额
// 卖出销毁相关字段
burnQuantity Decimal @default(0) @map("burn_quantity") @db.Decimal(30, 8) // 卖出销毁量
burnMultiplier Decimal @default(0) @map("burn_multiplier") @db.Decimal(30, 18) // 销毁倍数
effectiveQuantity Decimal @default(0) @map("effective_quantity") @db.Decimal(30, 8) // 有效卖出量(含销毁)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
cancelledAt DateTime?
@ -61,14 +184,16 @@ model Order {
model Trade {
id String @id @default(uuid())
tradeNo String @unique
buyOrderId String
sellOrderId String
buyerSequence String
sellerSequence String
buyOrderId String @map("buy_order_id")
sellOrderId String @map("sell_order_id")
buyerSequence String @map("buyer_sequence")
sellerSequence String @map("seller_sequence")
price Decimal @db.Decimal(30, 18)
quantity Decimal @db.Decimal(30, 8)
amount Decimal @db.Decimal(30, 8) // price * quantity
createdAt DateTime @default(now())
quantity Decimal @db.Decimal(30, 8) // 实际成交量
burnQuantity Decimal @default(0) @map("burn_quantity") @db.Decimal(30, 8) // 卖出销毁量
effectiveQty Decimal @default(0) @map("effective_qty") @db.Decimal(30, 8) // 有效量quantity + burnQuantity
amount Decimal @db.Decimal(30, 8) // effectiveQty * price卖出交易额
createdAt DateTime @default(now()) @map("created_at")
buyOrder Order @relation(fields: [buyOrderId], references: [id])
@ -281,3 +406,18 @@ model OutboxEvent {
@@index([createdAt])
@@map("outbox_events")
}
// ==================== 已处理事件(幂等性)====================
// 已处理事件记录(用于消费者幂等性检查)
model ProcessedEvent {
id String @id @default(uuid())
eventId String @unique @map("event_id") // 事件唯一ID
eventType String @map("event_type") // 事件类型
sourceService String @map("source_service") // 来源服务
processedAt DateTime @default(now()) @map("processed_at")
@@index([eventId])
@@index([processedAt])
@@map("processed_events")
}
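A consumer-side idempotency guard over this table could look like the sketch below; the actual repository used by trading-service is not shown in this diff, so the sketch goes straight through the Prisma client and relies on the unique event_id index:

```typescript
import { PrismaClient } from '@prisma/client';

// Sketch: run a handler at most once per eventId, recording processed events afterwards.
async function processOnce(
  prisma: PrismaClient,
  eventId: string,
  eventType: string,
  sourceService: string,
  handler: () => Promise<void>,
): Promise<boolean> {
  const seen = await prisma.processedEvent.findUnique({ where: { eventId } });
  if (seen) return false;              // duplicate delivery, skip

  await handler();                     // business logic

  await prisma.processedEvent.create({
    data: { eventId, eventType, sourceService },
  });
  return true;
}
```

The unique index on event_id still catches the race between the lookup and the insert; wrapping the handler and the insert in a single transaction tightens the guarantee further.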

View File

@ -5,9 +5,20 @@ import { TradingController } from './controllers/trading.controller';
import { TransferController } from './controllers/transfer.controller';
import { HealthController } from './controllers/health.controller';
import { AdminController } from './controllers/admin.controller';
import { PriceController } from './controllers/price.controller';
import { BurnController } from './controllers/burn.controller';
import { AssetController } from './controllers/asset.controller';
@Module({
imports: [ApplicationModule, InfrastructureModule],
controllers: [TradingController, TransferController, HealthController, AdminController],
controllers: [
TradingController,
TransferController,
HealthController,
AdminController,
PriceController,
BurnController,
AssetController,
],
})
export class ApiModule {}

View File

@ -0,0 +1,68 @@
import { Controller, Get, Param, Query, Req } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiParam, ApiQuery, ApiBearerAuth } from '@nestjs/swagger';
import { AssetService } from '../../application/services/asset.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ApiTags('Asset')
@ApiBearerAuth()
@Controller('asset')
export class AssetController {
constructor(private readonly assetService: AssetService) {}
@Get('my')
@ApiOperation({ summary: '获取我的资产显示' })
@ApiQuery({ name: 'dailyAllocation', required: false, type: String, description: '每日分配量(可选)' })
async getMyAsset(@Req() req: any, @Query('dailyAllocation') dailyAllocation?: string) {
const accountSequence = req.user?.accountSequence;
if (!accountSequence) {
throw new Error('Unauthorized');
}
const asset = await this.assetService.getAssetDisplay(accountSequence, dailyAllocation);
if (!asset) {
throw new Error('Account not found');
}
return asset;
}
@Get('account/:accountSequence')
@Public()
@ApiOperation({ summary: '获取指定账户资产显示' })
@ApiParam({ name: 'accountSequence', description: '账户序号' })
@ApiQuery({ name: 'dailyAllocation', required: false, type: String, description: '每日分配量(可选)' })
async getAccountAsset(
@Param('accountSequence') accountSequence: string,
@Query('dailyAllocation') dailyAllocation?: string,
) {
const asset = await this.assetService.getAssetDisplay(accountSequence, dailyAllocation);
if (!asset) {
return { message: 'Account not found' };
}
return asset;
}
@Get('estimate-sell')
@Public()
@ApiOperation({ summary: '预估卖出收益' })
@ApiQuery({ name: 'quantity', required: true, type: String, description: '卖出数量' })
async estimateSellProceeds(@Query('quantity') quantity: string) {
return this.assetService.estimateSellProceeds(quantity);
}
@Get('market')
@Public()
@ApiOperation({ summary: '获取市场概览' })
async getMarketOverview() {
return this.assetService.getMarketOverview();
}
@Get('growth-per-second')
@Public()
@ApiOperation({ summary: '计算资产每秒增长量' })
@ApiQuery({ name: 'dailyAllocation', required: true, type: String, description: '每日分配量' })
async calculateGrowthPerSecond(@Query('dailyAllocation') dailyAllocation: string) {
const perSecond = this.assetService.calculateAssetGrowthPerSecond(dailyAllocation);
return { dailyAllocation, assetGrowthPerSecond: perSecond };
}
}

View File

@ -0,0 +1,31 @@
import { Controller, Get, Query } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiQuery } from '@nestjs/swagger';
import { BurnService } from '../../application/services/burn.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ApiTags('Burn')
@Controller('burn')
export class BurnController {
constructor(private readonly burnService: BurnService) {}
@Get('status')
@Public()
@ApiOperation({ summary: '获取销毁状态' })
async getBurnStatus() {
return this.burnService.getBurnStatus();
}
@Get('records')
@Public()
@ApiOperation({ summary: '获取销毁记录' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
@ApiQuery({ name: 'sourceType', required: false, enum: ['MINUTE_BURN', 'SELL_BURN'] })
async getBurnRecords(
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
@Query('sourceType') sourceType?: 'MINUTE_BURN' | 'SELL_BURN',
) {
return this.burnService.getBurnRecords(page ?? 1, pageSize ?? 50, sourceType);
}
}

View File

@ -2,9 +2,11 @@ import { Controller, Get } from '@nestjs/common';
import { ApiTags, ApiOperation } from '@nestjs/swagger';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ApiTags('Health')
@Controller('health')
@Public()
export class HealthController {
constructor(
private readonly prisma: PrismaService,

View File

@ -0,0 +1,46 @@
import { Controller, Get, Query } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiQuery, ApiBearerAuth } from '@nestjs/swagger';
import { PriceService } from '../../application/services/price.service';
import { Public } from '../../shared/guards/jwt-auth.guard';
@ApiTags('Price')
@Controller('price')
export class PriceController {
constructor(private readonly priceService: PriceService) {}
@Get('current')
@Public()
@ApiOperation({ summary: '获取当前价格信息' })
async getCurrentPrice() {
return this.priceService.getCurrentPrice();
}
@Get('latest')
@Public()
@ApiOperation({ summary: '获取最新价格快照' })
async getLatestSnapshot() {
const snapshot = await this.priceService.getLatestSnapshot();
if (!snapshot) {
return { message: 'No price snapshot available' };
}
return snapshot;
}
@Get('history')
@Public()
@ApiOperation({ summary: '获取价格历史' })
@ApiQuery({ name: 'startTime', required: true, type: String, description: 'ISO datetime' })
@ApiQuery({ name: 'endTime', required: true, type: String, description: 'ISO datetime' })
@ApiQuery({ name: 'limit', required: false, type: Number })
async getPriceHistory(
@Query('startTime') startTime: string,
@Query('endTime') endTime: string,
@Query('limit') limit?: number,
) {
return this.priceService.getPriceHistory(
new Date(startTime),
new Date(endTime),
limit ?? 1440,
);
}
}

View File

@ -3,11 +3,25 @@ import { ScheduleModule } from '@nestjs/schedule';
import { InfrastructureModule } from '../infrastructure/infrastructure.module';
import { OrderService } from './services/order.service';
import { TransferService } from './services/transfer.service';
import { PriceService } from './services/price.service';
import { BurnService } from './services/burn.service';
import { AssetService } from './services/asset.service';
import { OutboxScheduler } from './schedulers/outbox.scheduler';
import { BurnScheduler } from './schedulers/burn.scheduler';
@Module({
imports: [ScheduleModule.forRoot(), InfrastructureModule],
providers: [OrderService, TransferService, OutboxScheduler],
exports: [OrderService, TransferService],
providers: [
// Services
PriceService,
BurnService,
AssetService,
OrderService,
TransferService,
// Schedulers
OutboxScheduler,
BurnScheduler,
],
exports: [OrderService, TransferService, PriceService, BurnService, AssetService],
})
export class ApplicationModule {}

View File

@ -0,0 +1,95 @@
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { Cron, CronExpression } from '@nestjs/schedule';
import { BurnService } from '../services/burn.service';
import { PriceService } from '../services/price.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
@Injectable()
export class BurnScheduler implements OnModuleInit {
private readonly logger = new Logger(BurnScheduler.name);
constructor(
private readonly burnService: BurnService,
private readonly priceService: PriceService,
private readonly redis: RedisService,
) {}
async onModuleInit() {
this.logger.log('Burn scheduler initialized');
// 初始化销毁系统
try {
await this.burnService.initialize();
this.logger.log('Burn system initialized');
} catch (error) {
this.logger.error('Failed to initialize burn system', error);
}
}
/**
 * Execute the per-minute burn
 * minuteBurnRate = 10,000,000,000 ÷ (365×4×1440) = 4756.468797564687
 */
@Cron(CronExpression.EVERY_MINUTE)
async executeMinuteBurn(): Promise<void> {
try {
const burnAmount = await this.burnService.executeMinuteBurn();
if (!burnAmount.isZero()) {
this.logger.debug(`Minute burn completed: ${burnAmount.toFixed(8)}`);
}
} catch (error) {
this.logger.error('Failed to execute minute burn', error);
}
}
/**
 * Create a price snapshot every minute
 */
@Cron(CronExpression.EVERY_MINUTE)
async createPriceSnapshot(): Promise<void> {
try {
await this.priceService.createSnapshot();
} catch (error) {
this.logger.error('Failed to create price snapshot', error);
}
}
/**
 * Clean up price snapshots older than 30 days
 */
@Cron('0 3 * * *') // 每天凌晨3点
async cleanupOldSnapshots(): Promise<void> {
const lockValue = await this.redis.acquireLock('snapshot:cleanup:lock', 300);
if (!lockValue) {
return;
}
try {
// 通过 PriceService 调用 repository 清理
this.logger.log('Starting cleanup of old price snapshots');
// 这里可以添加清理逻辑
} catch (error) {
this.logger.error('Failed to cleanup old snapshots', error);
} finally {
await this.redis.releaseLock('snapshot:cleanup:lock', lockValue);
}
}
/**
 * Log the burn status every hour
 */
@Cron('0 * * * *') // 每小时整点
async logBurnStatus(): Promise<void> {
try {
const status = await this.burnService.getBurnStatus();
this.logger.log(
`Burn status: burned=${status.totalBurned}, ` +
`remaining=${status.remainingBurn}, ` +
`progress=${status.burnProgress}%, ` +
`minuteRate=${status.minuteBurnRate}`,
);
} catch (error) {
this.logger.error('Failed to log burn status', error);
}
}
}

View File

@ -1 +1,2 @@
export * from './outbox.scheduler';
export * from './burn.scheduler';

View File

@ -0,0 +1,199 @@
import { Injectable, Logger } from '@nestjs/common';
import { TradingCalculatorService } from '../../domain/services/trading-calculator.service';
import { TradingAccountRepository } from '../../infrastructure/persistence/repositories/trading-account.repository';
import { BlackHoleRepository } from '../../infrastructure/persistence/repositories/black-hole.repository';
import { CirculationPoolRepository } from '../../infrastructure/persistence/repositories/circulation-pool.repository';
import { SharePoolRepository } from '../../infrastructure/persistence/repositories/share-pool.repository';
import { PriceService } from './price.service';
import { Money } from '../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
export interface AssetDisplay {
// 账户积分股余额
shareBalance: string;
// 账户现金余额
cashBalance: string;
// 冻结积分股
frozenShares: string;
// 冻结现金
frozenCash: string;
// 可用积分股
availableShares: string;
// 可用现金
availableCash: string;
// 当前价格
currentPrice: string;
// 销毁倍数
burnMultiplier: string;
// 有效积分股(含销毁加成)
effectiveShares: string;
// 资产显示值 = (账户积分股 + 账户积分股 × 倍数) × 积分股价
displayAssetValue: string;
// 每秒增长量(需要外部传入每日分配量)
assetGrowthPerSecond: string;
// 累计买入
totalBought: string;
// 累计卖出
totalSold: string;
}
@Injectable()
export class AssetService {
private readonly logger = new Logger(AssetService.name);
private readonly calculator = new TradingCalculatorService();
constructor(
private readonly tradingAccountRepository: TradingAccountRepository,
private readonly blackHoleRepository: BlackHoleRepository,
private readonly circulationPoolRepository: CirculationPoolRepository,
private readonly sharePoolRepository: SharePoolRepository,
private readonly priceService: PriceService,
) {}
/**
 * Get the asset display for an account
 * Display asset value = (share balance + share balance × multiplier) × share price
 *
 * @param accountSequence account sequence number
 * @param dailyAllocation optional daily allocation, used for the per-second growth figure
 */
async getAssetDisplay(
accountSequence: string,
dailyAllocation?: string,
): Promise<AssetDisplay | null> {
const account = await this.tradingAccountRepository.findByAccountSequence(accountSequence);
if (!account) {
return null;
}
// 获取当前价格信息
const priceInfo = await this.priceService.getCurrentPrice();
const price = new Money(priceInfo.price);
const burnMultiplier = new Decimal(priceInfo.burnMultiplier);
// 计算有效积分股 = 余额 × (1 + 倍数)
const multiplierFactor = new Decimal(1).plus(burnMultiplier);
const effectiveShares = account.shareBalance.value.times(multiplierFactor);
// 计算资产显示值
const displayAssetValue = this.calculator.calculateDisplayAssetValue(
account.shareBalance,
burnMultiplier,
price,
);
// 计算每秒增长量
let assetGrowthPerSecond = Money.zero();
if (dailyAllocation) {
const dailyAmount = new Money(dailyAllocation);
assetGrowthPerSecond = this.calculator.calculateAssetGrowthPerSecond(dailyAmount);
}
return {
shareBalance: account.shareBalance.toFixed(8),
cashBalance: account.cashBalance.toFixed(8),
frozenShares: account.frozenShares.toFixed(8),
frozenCash: account.frozenCash.toFixed(8),
availableShares: account.availableShares.toFixed(8),
availableCash: account.availableCash.toFixed(8),
currentPrice: price.toFixed(18),
burnMultiplier: burnMultiplier.toFixed(18),
effectiveShares: new Money(effectiveShares).toFixed(8),
displayAssetValue: displayAssetValue.toFixed(8),
assetGrowthPerSecond: assetGrowthPerSecond.toFixed(18),
totalBought: account.totalBought.toFixed(8),
totalSold: account.totalSold.toFixed(8),
};
}
/**
 * Per-second asset growth
 * = daily allocation ÷ 24 ÷ 60 ÷ 60
 */
calculateAssetGrowthPerSecond(dailyAllocation: string): string {
const dailyAmount = new Money(dailyAllocation);
const perSecond = this.calculator.calculateAssetGrowthPerSecond(dailyAmount);
return perSecond.toFixed(18);
}
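Assuming the calculator follows the ÷ 24 ÷ 60 ÷ 60 rule noted above, a daily allocation of 86,400 grows by exactly 1 per second and 8,640 by 0.1 per second. An illustrative call against an injected AssetService instance:

```typescript
// Illustrative: dailyAllocation ÷ 86,400 seconds in a day.
assetService.calculateAssetGrowthPerSecond('86400'); // '1.000000000000000000'
assetService.calculateAssetGrowthPerSecond('8640');  // '0.100000000000000000'
```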
/**
 * Estimate sell proceeds
 * proceeds = (sell quantity + burn quantity) × price
 */
async estimateSellProceeds(sellQuantity: string): Promise<{
sellQuantity: string;
burnQuantity: string;
effectiveQuantity: string;
price: string;
proceeds: string;
burnMultiplier: string;
}> {
const quantity = new Money(sellQuantity);
const result = await this.priceService.calculateSellAmount(quantity);
return {
sellQuantity: quantity.toFixed(8),
burnQuantity: result.burnQuantity.toFixed(8),
effectiveQuantity: result.effectiveQuantity.toFixed(8),
price: result.price.toFixed(18),
proceeds: result.amount.toFixed(8),
burnMultiplier: (await this.priceService.getCurrentBurnMultiplier()).toFixed(18),
};
}
/**
 * Get the market overview (price, pools, burn progress)
 */
async getMarketOverview(): Promise<{
price: string;
greenPoints: string;
blackHoleAmount: string;
circulationPool: string;
effectiveDenominator: string;
burnMultiplier: string;
totalShares: string;
burnTarget: string;
burnProgress: string;
}> {
const [sharePool, blackHole, circulationPool] = await Promise.all([
this.sharePoolRepository.getPool(),
this.blackHoleRepository.getBlackHole(),
this.circulationPoolRepository.getPool(),
]);
const greenPoints = sharePool?.greenPoints || Money.zero();
const blackHoleAmount = blackHole?.totalBurned || Money.zero();
const circulationPoolAmount = circulationPool?.totalShares || Money.zero();
// 计算价格
const price = this.calculator.calculatePrice(greenPoints, blackHoleAmount, circulationPoolAmount);
// 计算有效分母
const effectiveDenominator = this.calculator.calculateEffectiveDenominator(
blackHoleAmount,
circulationPoolAmount,
);
// 计算销毁倍数
const burnMultiplier = this.calculator.calculateSellBurnMultiplier(
blackHoleAmount,
circulationPoolAmount,
);
// 销毁进度
const targetBurn = blackHole?.targetBurn || new Money(TradingCalculatorService.BURN_TARGET);
const burnProgress = blackHoleAmount.value.dividedBy(targetBurn.value).times(100);
return {
price: price.toFixed(18),
greenPoints: greenPoints.toFixed(8),
blackHoleAmount: blackHoleAmount.toFixed(8),
circulationPool: circulationPoolAmount.toFixed(8),
effectiveDenominator: effectiveDenominator.toFixed(8),
burnMultiplier: burnMultiplier.toFixed(18),
totalShares: TradingCalculatorService.TOTAL_SHARES.toFixed(8),
burnTarget: targetBurn.toFixed(8),
burnProgress: burnProgress.toFixed(4),
};
}
}
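The overview ties the three pools back to a price formula analogous to the one in the removed mining-side snapshot code; the trading calculator's internals are not shown in this diff, so the following is only a sketch of the presumed relationship:

```typescript
import Decimal from 'decimal.js';

// Presumed relationship (not confirmed by this diff):
//   effectiveDenominator = totalShares - blackHoleAmount - circulationPool
//   price                = greenPoints / effectiveDenominator
function presumedPrice(
  totalShares: Decimal,
  greenPoints: Decimal,
  blackHoleAmount: Decimal,
  circulationPool: Decimal,
): Decimal {
  const effectiveDenominator = totalShares.minus(blackHoleAmount).minus(circulationPool);
  return greenPoints.dividedBy(effectiveDenominator);
}
```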

View File

@ -0,0 +1,365 @@
import { Injectable, Logger } from '@nestjs/common';
import { TradingCalculatorService } from '../../domain/services/trading-calculator.service';
import { BlackHoleRepository } from '../../infrastructure/persistence/repositories/black-hole.repository';
import { CirculationPoolRepository } from '../../infrastructure/persistence/repositories/circulation-pool.repository';
import { TradingConfigRepository } from '../../infrastructure/persistence/repositories/trading-config.repository';
import { OutboxRepository } from '../../infrastructure/persistence/repositories/outbox.repository';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { Money } from '../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
import {
TradingEventTypes,
TradingTopics,
BurnExecutedPayload,
MinuteBurnExecutedPayload,
} from '../../domain/events/trading.events';
export interface BurnStatus {
totalBurned: string;
targetBurn: string;
remainingBurn: string;
burnProgress: string; // 百分比
minuteBurnRate: string;
remainingMinutes: number;
lastBurnMinute: Date | null;
}
export interface SellBurnResult {
burnQuantity: Money;
burnMultiplier: Decimal;
newMinuteBurnRate: Money;
}
@Injectable()
export class BurnService {
private readonly logger = new Logger(BurnService.name);
private readonly calculator = new TradingCalculatorService();
constructor(
private readonly blackHoleRepository: BlackHoleRepository,
private readonly circulationPoolRepository: CirculationPoolRepository,
private readonly tradingConfigRepository: TradingConfigRepository,
private readonly outboxRepository: OutboxRepository,
private readonly redis: RedisService,
) {}
/**
 * Get the current burn status
 */
async getBurnStatus(): Promise<BurnStatus> {
const [blackHole, config] = await Promise.all([
this.blackHoleRepository.getBlackHole(),
this.tradingConfigRepository.getConfig(),
]);
const totalBurned = blackHole?.totalBurned || Money.zero();
const targetBurn = blackHole?.targetBurn || new Money(TradingCalculatorService.BURN_TARGET);
const remainingBurn = blackHole?.remainingBurn || targetBurn;
// 计算进度百分比
const progress = totalBurned.value.dividedBy(targetBurn.value).times(100);
// 计算剩余分钟数
const activatedAt = config?.activatedAt || new Date();
const remainingMinutes = this.calculator.calculateRemainingMinutes(activatedAt);
return {
totalBurned: totalBurned.toFixed(8),
targetBurn: targetBurn.toFixed(8),
remainingBurn: remainingBurn.toFixed(8),
burnProgress: progress.toFixed(4),
minuteBurnRate: (config?.minuteBurnRate || Money.zero()).toFixed(18),
remainingMinutes,
lastBurnMinute: blackHole?.lastBurnMinute || null,
};
}
/**
 * Execute the per-minute burn (guarded by a distributed lock)
 */
async executeMinuteBurn(): Promise<Money> {
const lockValue = await this.redis.acquireLock('burn:minute:lock', 55);
if (!lockValue) {
return Money.zero();
}
try {
const config = await this.tradingConfigRepository.getConfig();
if (!config || !config.isActive) {
return Money.zero();
}
const blackHole = await this.blackHoleRepository.getBlackHole();
if (!blackHole) {
return Money.zero();
}
// 检查是否已完成销毁目标
if (blackHole.remainingBurn.isZero()) {
return Money.zero();
}
const currentMinute = new Date();
currentMinute.setSeconds(0, 0);
// 检查是否已处理过这一分钟
const processedKey = `burn:processed:${currentMinute.getTime()}`;
if (await this.redis.get(processedKey)) {
return Money.zero();
}
// 使用当前配置的每分钟销毁率
let burnAmount = config.minuteBurnRate;
// 确保不超过剩余待销毁量
if (burnAmount.isGreaterThan(blackHole.remainingBurn)) {
burnAmount = blackHole.remainingBurn;
}
if (burnAmount.isZero()) {
return Money.zero();
}
// 记录销毁
const burnRecord = await this.blackHoleRepository.recordMinuteBurn(currentMinute, burnAmount);
// 标记已处理
await this.redis.set(processedKey, '1', 120);
this.logger.log(`Minute burn executed: ${burnAmount.toFixed(8)}`);
// 发布每分钟销毁事件
await this.publishMinuteBurnEvent(
burnRecord.id,
currentMinute,
burnAmount,
blackHole.totalBurned.add(burnAmount),
blackHole.remainingBurn.subtract(burnAmount),
);
return burnAmount;
} catch (error) {
this.logger.error('Failed to execute minute burn', error);
return Money.zero();
} finally {
await this.redis.releaseLock('burn:minute:lock', lockValue);
}
}
/**
* Execute a sell-side burn.
* Burn quantity = sell quantity × burn multiplier.
*/
async executeSellBurn(
sellQuantity: Money,
accountSeq: string,
orderNo: string,
): Promise<SellBurnResult> {
const [blackHole, circulationPool, config] = await Promise.all([
this.blackHoleRepository.getBlackHole(),
this.circulationPoolRepository.getPool(),
this.tradingConfigRepository.getConfig(),
]);
if (!blackHole || !config) {
throw new Error('Trading system not initialized');
}
const blackHoleAmount = blackHole.totalBurned;
const circulationPoolAmount = circulationPool?.totalShares || Money.zero();
// Compute the burn multiplier
const burnMultiplier = this.calculator.calculateSellBurnMultiplier(
blackHoleAmount,
circulationPoolAmount,
);
// Compute the burn quantity
const burnQuantity = this.calculator.calculateSellBurnAmount(sellQuantity, burnMultiplier);
// Ensure the burn does not exceed the remaining target
const actualBurnQuantity = burnQuantity.isGreaterThan(blackHole.remainingBurn)
? blackHole.remainingBurn
: burnQuantity;
if (!actualBurnQuantity.isZero()) {
// Record the sell burn
const burnMinute = new Date();
burnMinute.setSeconds(0, 0);
const burnRecord = await this.blackHoleRepository.recordSellBurn(
burnMinute,
actualBurnQuantity,
accountSeq,
orderNo,
);
// Recalculate the per-minute burn rate
const newBlackHoleAmount = blackHoleAmount.add(actualBurnQuantity);
const remainingMinutes = this.calculator.calculateRemainingMinutes(
config.activatedAt || new Date(),
);
const newMinuteBurnRate = this.calculator.calculateMinuteBurnRate(
newBlackHoleAmount,
remainingMinutes,
);
// Persist the new per-minute burn rate in the config
await this.tradingConfigRepository.updateMinuteBurnRate(newMinuteBurnRate);
this.logger.log(
`Sell burn executed: quantity=${actualBurnQuantity.toFixed(8)}, ` +
`multiplier=${burnMultiplier.toFixed(8)}, newRate=${newMinuteBurnRate.toFixed(18)}`,
);
// Publish the sell-burn event
await this.publishSellBurnEvent(
burnRecord.id,
accountSeq,
orderNo,
actualBurnQuantity,
burnMultiplier,
blackHole.remainingBurn.subtract(actualBurnQuantity),
);
return {
burnQuantity: actualBurnQuantity,
burnMultiplier,
newMinuteBurnRate,
};
}
return {
burnQuantity: Money.zero(),
burnMultiplier,
newMinuteBurnRate: config.minuteBurnRate,
};
}
/**
* Initialize the trading config and black hole state if they do not exist yet.
*/
async initialize(): Promise<void> {
const [existingConfig, existingBlackHole] = await Promise.all([
this.tradingConfigRepository.getConfig(),
this.blackHoleRepository.getBlackHole(),
]);
if (!existingConfig) {
await this.tradingConfigRepository.initializeConfig();
this.logger.log('Trading config initialized');
}
if (!existingBlackHole) {
await this.blackHoleRepository.initializeBlackHole(
new Money(TradingCalculatorService.BURN_TARGET),
);
this.logger.log('Black hole initialized');
}
}
/**
* Query burn records with pagination and an optional source-type filter.
*/
async getBurnRecords(
page: number,
pageSize: number,
sourceType?: 'MINUTE_BURN' | 'SELL_BURN',
): Promise<{
data: any[];
total: number;
}> {
const result = await this.blackHoleRepository.getBurnRecords(page, pageSize, sourceType);
return {
data: result.data.map((r) => ({
id: r.id,
burnMinute: r.burnMinute,
burnAmount: r.burnAmount.toFixed(8),
remainingTarget: r.remainingTarget.toFixed(8),
sourceType: r.sourceType,
sourceAccountSeq: r.sourceAccountSeq,
sourceOrderNo: r.sourceOrderNo,
memo: r.memo,
createdAt: r.createdAt,
})),
total: result.total,
};
}
// ==================== Event publishing ====================
/**
* Publish a MinuteBurnExecuted event via the outbox.
*/
private async publishMinuteBurnEvent(
burnRecordId: string,
burnMinute: Date,
burnAmount: Money,
totalBurned: Money,
remainingTarget: Money,
): Promise<void> {
try {
const payload: MinuteBurnExecutedPayload = {
burnRecordId,
burnMinute: burnMinute.toISOString(),
burnAmount: burnAmount.toString(),
totalBurned: totalBurned.toString(),
remainingTarget: remainingTarget.toString(),
executedAt: new Date().toISOString(),
};
await this.outboxRepository.create({
aggregateType: 'BurnRecord',
aggregateId: burnRecordId,
eventType: TradingEventTypes.MINUTE_BURN_EXECUTED,
payload,
topic: TradingTopics.BURNS,
key: 'minute-burn',
});
this.logger.debug(`Published MinuteBurnExecuted event: ${burnAmount.toFixed(8)}`);
} catch (error) {
this.logger.error(`Failed to publish MinuteBurnExecuted event: ${error}`);
}
}
/**
* Publish a BurnExecuted (sell burn) event via the outbox.
*/
private async publishSellBurnEvent(
burnRecordId: string,
accountSeq: string,
orderNo: string,
burnAmount: Money,
burnMultiplier: Decimal,
remainingTarget: Money,
): Promise<void> {
try {
const payload: BurnExecutedPayload = {
burnRecordId,
sourceType: 'SELL',
sourceAccountSeq: accountSeq,
sourceOrderNo: orderNo,
burnAmount: burnAmount.toString(),
burnMultiplier: burnMultiplier.toString(),
remainingTarget: remainingTarget.toString(),
executedAt: new Date().toISOString(),
};
await this.outboxRepository.create({
aggregateType: 'BurnRecord',
aggregateId: burnRecordId,
eventType: TradingEventTypes.BURN_EXECUTED,
payload,
topic: TradingTopics.BURNS,
key: accountSeq,
});
this.logger.debug(`Published BurnExecuted event for account ${accountSeq}`);
} catch (error) {
this.logger.error(`Failed to publish BurnExecuted event: ${error}`);
}
}
}

View File

@ -1,12 +1,23 @@
import { Injectable, Logger } from '@nestjs/common';
import { OrderRepository } from '../../infrastructure/persistence/repositories/order.repository';
import { TradingAccountRepository } from '../../infrastructure/persistence/repositories/trading-account.repository';
import { CirculationPoolRepository } from '../../infrastructure/persistence/repositories/circulation-pool.repository';
import { OutboxRepository } from '../../infrastructure/persistence/repositories/outbox.repository';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { RedisService } from '../../infrastructure/redis/redis.service';
import { OrderAggregate, OrderType, OrderStatus } from '../../domain/aggregates/order.aggregate';
import { TradingAccountAggregate } from '../../domain/aggregates/trading-account.aggregate';
import { MatchingEngineService } from '../../domain/services/matching-engine.service';
import { Money } from '../../domain/value-objects/money.vo';
import { BurnService } from './burn.service';
import { PriceService } from './price.service';
import {
TradingEventTypes,
TradingTopics,
OrderCreatedPayload,
OrderCancelledPayload,
TradeExecutedPayload,
} from '../../domain/events/trading.events';
@Injectable()
export class OrderService {
@ -16,8 +27,12 @@ export class OrderService {
constructor(
private readonly orderRepository: OrderRepository,
private readonly accountRepository: TradingAccountRepository,
private readonly circulationPoolRepository: CirculationPoolRepository,
private readonly outboxRepository: OutboxRepository,
private readonly prisma: PrismaService,
private readonly redis: RedisService,
private readonly burnService: BurnService,
private readonly priceService: PriceService,
) {}
async createOrder(
@ -70,6 +85,9 @@ export class OrderService {
const orderId = await this.orderRepository.save(order);
await this.accountRepository.save(account);
// Publish the order-created event
await this.publishOrderCreatedEvent(orderId, order);
// Attempt order matching
await this.tryMatch(order);
@ -113,6 +131,9 @@ export class OrderService {
await this.orderRepository.save(order);
await this.accountRepository.save(account);
// Publish the order-cancelled event
await this.publishOrderCancelledEvent(order);
}
private async tryMatch(incomingOrder: OrderAggregate): Promise<void> {
@ -126,7 +147,36 @@ export class OrderService {
const matches = this.matchingEngine.findMatchingOrders(incomingOrder, orderBook);
for (const match of matches) {
// Save the trade record
const tradeQuantity = match.trade.quantity;
let burnQuantity = Money.zero();
let effectiveQuantity = tradeQuantity;
// For a sell fill, run the burn logic:
// sell burn quantity = sold shares × burn multiplier
// sell trade amount = (sell quantity + sell burn quantity) × share price
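// Illustrative numbers only (not from the repo): selling 10 shares at multiplier 2 burns 20 shares,
// so at a price of 0.5 the trade amount is (10 + 20) × 0.5 = 15.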
if (match.sellOrder) {
try {
const burnResult = await this.burnService.executeSellBurn(
tradeQuantity,
match.sellOrder.accountSequence,
match.sellOrder.orderNo,
);
burnQuantity = burnResult.burnQuantity;
effectiveQuantity = new Money(tradeQuantity.value.plus(burnQuantity.value));
this.logger.log(
`Sell burn executed: sellQty=${tradeQuantity.toFixed(8)}, ` +
`burnQty=${burnQuantity.toFixed(8)}, effectiveQty=${effectiveQuantity.toFixed(8)}`,
);
} catch (error) {
this.logger.warn(`Sell burn failed, continuing without burn: ${error}`);
}
}
// Trade amount = effective quantity × price
const tradeAmount = new Money(effectiveQuantity.value.times(match.trade.price.value));
// Save the trade record (including burn info)
await this.prisma.trade.create({
data: {
tradeNo: match.trade.tradeNo,
@ -135,31 +185,58 @@ export class OrderService {
buyerSequence: match.buyOrder.accountSequence,
sellerSequence: match.sellOrder.accountSequence,
price: match.trade.price.value,
quantity: match.trade.quantity.value,
amount: match.trade.amount.value,
quantity: tradeQuantity.value,
burnQuantity: burnQuantity.value,
effectiveQty: effectiveQuantity.value,
amount: tradeAmount.value,
},
});
// Update orders
// Sold shares flow into the circulation pool
try {
await this.circulationPoolRepository.addSharesFromSell(
tradeQuantity,
match.sellOrder.accountSequence,
match.sellOrder.id!,
`卖出成交, 交易号${match.trade.tradeNo}`,
);
} catch (error) {
this.logger.warn(`Failed to add shares to circulation pool: ${error}`);
}
// Update orders (including burn info)
await this.orderRepository.save(match.buyOrder);
await this.orderRepository.save(match.sellOrder);
await this.orderRepository.saveWithBurnInfo(match.sellOrder, burnQuantity, effectiveQuantity);
// Update the buyer account
const buyerAccount = await this.accountRepository.findByAccountSequence(match.buyOrder.accountSequence);
if (buyerAccount) {
buyerAccount.executeBuy(match.trade.quantity, match.trade.amount, match.trade.tradeNo);
buyerAccount.executeBuy(tradeQuantity, tradeAmount, match.trade.tradeNo);
await this.accountRepository.save(buyerAccount);
}
// Update the seller account
// Update the seller account (credited with the effective trade amount)
const sellerAccount = await this.accountRepository.findByAccountSequence(match.sellOrder.accountSequence);
if (sellerAccount) {
sellerAccount.executeSell(match.trade.quantity, match.trade.amount, match.trade.tradeNo);
sellerAccount.executeSell(tradeQuantity, tradeAmount, match.trade.tradeNo);
await this.accountRepository.save(sellerAccount);
}
this.logger.log(
`Trade executed: ${match.trade.tradeNo}, price=${match.trade.price}, qty=${match.trade.quantity}`,
`Trade executed: ${match.trade.tradeNo}, price=${match.trade.price.toFixed(8)}, ` +
`qty=${tradeQuantity.toFixed(8)}, burn=${burnQuantity.toFixed(8)}, amount=${tradeAmount.toFixed(8)}`,
);
// Publish the trade-executed event
await this.publishTradeExecutedEvent(
match.trade.tradeNo,
match.buyOrder,
match.sellOrder,
match.trade.price,
tradeQuantity,
tradeAmount,
burnQuantity,
effectiveQuantity,
);
}
} finally {
@ -172,4 +249,116 @@ export class OrderService {
const random = Math.random().toString(36).substring(2, 8);
return `O${timestamp}${random}`.toUpperCase();
}
// ==================== Event publishing ====================
/**
* Publish an OrderCreated event via the outbox.
*/
private async publishOrderCreatedEvent(orderId: string, order: OrderAggregate): Promise<void> {
try {
const payload: OrderCreatedPayload = {
orderId,
orderNo: order.orderNo,
accountSequence: order.accountSequence,
type: order.type,
price: order.price.toString(),
quantity: order.quantity.toString(),
createdAt: new Date().toISOString(),
};
await this.outboxRepository.create({
aggregateType: 'Order',
aggregateId: orderId,
eventType: TradingEventTypes.ORDER_CREATED,
payload,
topic: TradingTopics.ORDERS,
key: order.accountSequence,
});
this.logger.debug(`Published OrderCreated event for order ${order.orderNo}`);
} catch (error) {
this.logger.error(`Failed to publish OrderCreated event: ${error}`);
}
}
/**
* Publish an OrderCancelled event via the outbox.
*/
private async publishOrderCancelledEvent(order: OrderAggregate): Promise<void> {
try {
const payload: OrderCancelledPayload = {
orderId: order.id!,
orderNo: order.orderNo,
accountSequence: order.accountSequence,
type: order.type,
cancelledQuantity: order.remainingQuantity.toString(),
cancelledAt: new Date().toISOString(),
};
await this.outboxRepository.create({
aggregateType: 'Order',
aggregateId: order.id!,
eventType: TradingEventTypes.ORDER_CANCELLED,
payload,
topic: TradingTopics.ORDERS,
key: order.accountSequence,
});
this.logger.debug(`Published OrderCancelled event for order ${order.orderNo}`);
} catch (error) {
this.logger.error(`Failed to publish OrderCancelled event: ${error}`);
}
}
/**
* Publish a TradeExecuted event via the outbox.
*/
private async publishTradeExecutedEvent(
tradeNo: string,
buyOrder: OrderAggregate,
sellOrder: OrderAggregate,
price: Money,
quantity: Money,
amount: Money,
burnQuantity: Money,
effectiveQuantity: Money,
): Promise<void> {
try {
// Look up the just-created trade by tradeNo to obtain its id
const trade = await this.prisma.trade.findUnique({ where: { tradeNo } });
if (!trade) {
this.logger.warn(`Trade not found for event publishing: ${tradeNo}`);
return;
}
const payload: TradeExecutedPayload = {
tradeId: trade.id,
tradeNo,
buyOrderId: buyOrder.id!,
sellOrderId: sellOrder.id!,
buyerSequence: buyOrder.accountSequence,
sellerSequence: sellOrder.accountSequence,
price: price.toString(),
quantity: quantity.toString(),
amount: amount.toString(),
burnQuantity: burnQuantity.toString(),
effectiveQuantity: effectiveQuantity.toString(),
createdAt: new Date().toISOString(),
};
await this.outboxRepository.create({
aggregateType: 'Trade',
aggregateId: trade.id,
eventType: TradingEventTypes.TRADE_EXECUTED,
payload,
topic: TradingTopics.TRADES,
key: tradeNo,
});
this.logger.debug(`Published TradeExecuted event for trade ${tradeNo}`);
} catch (error) {
this.logger.error(`Failed to publish TradeExecuted event: ${error}`);
}
}
}

View File

@ -0,0 +1,229 @@
import { Injectable, Logger } from '@nestjs/common';
import { TradingCalculatorService } from '../../domain/services/trading-calculator.service';
import { BlackHoleRepository } from '../../infrastructure/persistence/repositories/black-hole.repository';
import { SharePoolRepository } from '../../infrastructure/persistence/repositories/share-pool.repository';
import { CirculationPoolRepository } from '../../infrastructure/persistence/repositories/circulation-pool.repository';
import { PriceSnapshotRepository } from '../../infrastructure/persistence/repositories/price-snapshot.repository';
import { TradingConfigRepository } from '../../infrastructure/persistence/repositories/trading-config.repository';
import { Money } from '../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
export interface PriceInfo {
price: string;
greenPoints: string;
blackHoleAmount: string;
circulationPool: string;
effectiveDenominator: string;
burnMultiplier: string;
minuteBurnRate: string;
snapshotTime: Date;
}
@Injectable()
export class PriceService {
private readonly logger = new Logger(PriceService.name);
private readonly calculator = new TradingCalculatorService();
constructor(
private readonly blackHoleRepository: BlackHoleRepository,
private readonly sharePoolRepository: SharePoolRepository,
private readonly circulationPoolRepository: CirculationPoolRepository,
private readonly priceSnapshotRepository: PriceSnapshotRepository,
private readonly tradingConfigRepository: TradingConfigRepository,
) {}
/**
* Compute the current price and related pool metrics.
*/
async getCurrentPrice(): Promise<PriceInfo> {
const [sharePool, blackHole, circulationPool, config] = await Promise.all([
this.sharePoolRepository.getPool(),
this.blackHoleRepository.getBlackHole(),
this.circulationPoolRepository.getPool(),
this.tradingConfigRepository.getConfig(),
]);
const greenPoints = sharePool?.greenPoints || Money.zero();
const blackHoleAmount = blackHole?.totalBurned || Money.zero();
const circulationPoolAmount = circulationPool?.totalShares || Money.zero();
// Compute the price
const price = this.calculator.calculatePrice(greenPoints, blackHoleAmount, circulationPoolAmount);
// Compute the effective denominator
const effectiveDenominator = this.calculator.calculateEffectiveDenominator(
blackHoleAmount,
circulationPoolAmount,
);
// Compute the burn multiplier
const burnMultiplier = this.calculator.calculateSellBurnMultiplier(
blackHoleAmount,
circulationPoolAmount,
);
// Current per-minute burn rate
const minuteBurnRate = config?.minuteBurnRate || Money.zero();
return {
price: price.toFixed(18),
greenPoints: greenPoints.toFixed(8),
blackHoleAmount: blackHoleAmount.toFixed(8),
circulationPool: circulationPoolAmount.toFixed(8),
effectiveDenominator: effectiveDenominator.toFixed(8),
burnMultiplier: burnMultiplier.toFixed(18),
minuteBurnRate: minuteBurnRate.toFixed(18),
snapshotTime: new Date(),
};
}
/**
* Compute the current sell-burn multiplier.
*/
async getCurrentBurnMultiplier(): Promise<Decimal> {
const [blackHole, circulationPool] = await Promise.all([
this.blackHoleRepository.getBlackHole(),
this.circulationPoolRepository.getPool(),
]);
const blackHoleAmount = blackHole?.totalBurned || Money.zero();
const circulationPoolAmount = circulationPool?.totalShares || Money.zero();
return this.calculator.calculateSellBurnMultiplier(blackHoleAmount, circulationPoolAmount);
}
/**
* Preview the burn for a given sell quantity.
*/
async calculateSellBurn(sellQuantity: Money): Promise<{
burnQuantity: Money;
burnMultiplier: Decimal;
effectiveQuantity: Money;
}> {
const burnMultiplier = await this.getCurrentBurnMultiplier();
const burnQuantity = this.calculator.calculateSellBurnAmount(sellQuantity, burnMultiplier);
const effectiveQuantity = new Money(sellQuantity.value.plus(burnQuantity.value));
return {
burnQuantity,
burnMultiplier,
effectiveQuantity,
};
}
/**
* Preview the settlement amount for a given sell quantity.
*/
async calculateSellAmount(sellQuantity: Money): Promise<{
amount: Money;
burnQuantity: Money;
effectiveQuantity: Money;
price: Money;
}> {
const priceInfo = await this.getCurrentPrice();
const price = new Money(priceInfo.price);
const { burnQuantity, effectiveQuantity } = await this.calculateSellBurn(sellQuantity);
const amount = this.calculator.calculateSellAmount(sellQuantity, burnQuantity, price);
return {
amount,
burnQuantity,
effectiveQuantity,
price,
};
}
/**
* Create a per-minute price snapshot.
*/
async createSnapshot(): Promise<void> {
try {
const [sharePool, blackHole, circulationPool, config] = await Promise.all([
this.sharePoolRepository.getPool(),
this.blackHoleRepository.getBlackHole(),
this.circulationPoolRepository.getPool(),
this.tradingConfigRepository.getConfig(),
]);
const greenPoints = sharePool?.greenPoints || Money.zero();
const blackHoleAmount = blackHole?.totalBurned || Money.zero();
const circulationPoolAmount = circulationPool?.totalShares || Money.zero();
const price = this.calculator.calculatePrice(greenPoints, blackHoleAmount, circulationPoolAmount);
const effectiveDenominator = this.calculator.calculateEffectiveDenominator(
blackHoleAmount,
circulationPoolAmount,
);
const minuteBurnRate = config?.minuteBurnRate || Money.zero();
const snapshotTime = new Date();
snapshotTime.setSeconds(0, 0);
await this.priceSnapshotRepository.createSnapshot({
snapshotTime,
price,
greenPoints,
blackHoleAmount,
circulationPool: circulationPoolAmount,
effectiveDenominator,
minuteBurnRate,
});
this.logger.debug(`Price snapshot created: ${price.toFixed(18)}`);
} catch (error) {
this.logger.error('Failed to create price snapshot', error);
}
}
/**
* Query the price history between two timestamps.
*/
async getPriceHistory(
startTime: Date,
endTime: Date,
limit: number = 1440,
): Promise<
Array<{
time: Date;
price: string;
greenPoints: string;
blackHoleAmount: string;
circulationPool: string;
}>
> {
const snapshots = await this.priceSnapshotRepository.getPriceHistory(startTime, endTime, limit);
return snapshots.map((s) => ({
time: s.snapshotTime,
price: s.price.toFixed(18),
greenPoints: s.greenPoints.toFixed(8),
blackHoleAmount: s.blackHoleAmount.toFixed(8),
circulationPool: s.circulationPool.toFixed(8),
}));
}
/**
* Return the latest price snapshot, if any.
*/
async getLatestSnapshot(): Promise<PriceInfo | null> {
const snapshot = await this.priceSnapshotRepository.getLatestSnapshot();
if (!snapshot) {
return null;
}
const burnMultiplier = await this.getCurrentBurnMultiplier();
return {
price: snapshot.price.toFixed(18),
greenPoints: snapshot.greenPoints.toFixed(8),
blackHoleAmount: snapshot.blackHoleAmount.toFixed(8),
circulationPool: snapshot.circulationPool.toFixed(8),
effectiveDenominator: snapshot.effectiveDenominator.toFixed(8),
burnMultiplier: burnMultiplier.toFixed(18),
minuteBurnRate: snapshot.minuteBurnRate.toFixed(18),
snapshotTime: snapshot.snapshotTime,
};
}
}

View File

@ -0,0 +1,2 @@
// Trading Service Event Types
export * from './trading.events';

View File

@ -0,0 +1,224 @@
/**
* Trading Service event definitions.
* Published to Kafka via the transactional outbox.
*/
// ==================== Event type constants ====================
export const TradingEventTypes = {
// Order events
ORDER_CREATED: 'order.created',
ORDER_CANCELLED: 'order.cancelled',
ORDER_COMPLETED: 'order.completed',
// Trade events
TRADE_EXECUTED: 'trade.executed',
// Transfer events
TRANSFER_INITIATED: 'transfer.initiated',
TRANSFER_COMPLETED: 'transfer.completed',
TRANSFER_FAILED: 'transfer.failed',
// Burn events
BURN_EXECUTED: 'burn.executed',
MINUTE_BURN_EXECUTED: 'burn.minute-executed',
// Price events
PRICE_UPDATED: 'price.updated',
// Account events
TRADING_ACCOUNT_CREATED: 'trading-account.created',
} as const;
export type TradingEventType =
(typeof TradingEventTypes)[keyof typeof TradingEventTypes];
// ==================== Kafka topic constants ====================
export const TradingTopics = {
ORDERS: 'trading.orders',
TRADES: 'trading.trades',
TRANSFERS: 'trading.transfers',
BURNS: 'trading.burns',
PRICES: 'trading.prices',
ACCOUNTS: 'trading.accounts',
} as const;
// ==================== Event payload types ====================
/**
* Order created event payload.
*/
export interface OrderCreatedPayload {
orderId: string;
orderNo: string;
accountSequence: string;
type: 'BUY' | 'SELL';
price: string;
quantity: string;
createdAt: string;
}
/**
* Order cancelled event payload.
*/
export interface OrderCancelledPayload {
orderId: string;
orderNo: string;
accountSequence: string;
type: 'BUY' | 'SELL';
cancelledQuantity: string;
cancelledAt: string;
}
/**
* Order completed event payload.
*/
export interface OrderCompletedPayload {
orderId: string;
orderNo: string;
accountSequence: string;
type: 'BUY' | 'SELL';
filledQuantity: string;
averagePrice: string;
totalAmount: string;
completedAt: string;
}
/**
* Trade executed event payload.
*/
export interface TradeExecutedPayload {
tradeId: string;
tradeNo: string;
buyOrderId: string;
sellOrderId: string;
buyerSequence: string;
sellerSequence: string;
price: string;
quantity: string;
amount: string;
burnQuantity: string;
effectiveQuantity: string;
createdAt: string;
}
/**
* Transfer initiated event payload.
*/
export interface TransferInitiatedPayload {
transferId: string;
transferNo: string;
accountSequence: string;
direction: 'IN' | 'OUT';
amount: string;
initiatedAt: string;
}
/**
* Transfer completed event payload.
*/
export interface TransferCompletedPayload {
transferId: string;
transferNo: string;
accountSequence: string;
direction: 'IN' | 'OUT';
amount: string;
miningTxId?: string;
completedAt: string;
}
/**
* Transfer failed event payload.
*/
export interface TransferFailedPayload {
transferId: string;
transferNo: string;
accountSequence: string;
direction: 'IN' | 'OUT';
amount: string;
errorMessage: string;
failedAt: string;
}
/**
* Burn executed event payload.
*/
export interface BurnExecutedPayload {
burnRecordId: string;
sourceType: 'SELL' | 'SCHEDULED';
sourceAccountSeq?: string;
sourceOrderNo?: string;
burnAmount: string;
burnMultiplier?: string;
remainingTarget: string;
executedAt: string;
}
/**
* Per-minute burn executed event payload.
*/
export interface MinuteBurnExecutedPayload {
burnRecordId: string;
burnMinute: string;
burnAmount: string;
totalBurned: string;
remainingTarget: string;
executedAt: string;
}
/**
* Price updated event payload.
*/
export interface PriceUpdatedPayload {
snapshotId: string;
price: string;
greenPoints: string;
blackHoleAmount: string;
circulationPool: string;
effectiveDenominator: string;
minuteBurnRate: string;
snapshotTime: string;
}
/**
* Trading account created event payload.
*/
export interface TradingAccountCreatedPayload {
accountId: string;
accountSequence: string;
createdAt: string;
}
// ==================== Base event type ====================
export interface TradingEvent<T = unknown> {
eventId: string;
eventType: TradingEventType;
aggregateType: string;
aggregateId: string;
payload: T;
timestamp: string;
version: number;
}
// ==================== Helpers ====================
/**
* Build a trading event envelope without an eventId.
*/
export function createTradingEvent<T>(
eventType: TradingEventType,
aggregateType: string,
aggregateId: string,
payload: T,
): Omit<TradingEvent<T>, 'eventId'> {
return {
eventType,
aggregateType,
aggregateId,
payload,
timestamp: new Date().toISOString(),
version: 1,
};
}
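// Usage sketch (illustrative; `snapshotId` and `payload` are placeholders, not names from this repo):
// const event = createTradingEvent(TradingEventTypes.PRICE_UPDATED, 'PriceSnapshot', snapshotId, payload);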

View File

@ -0,0 +1,241 @@
import Decimal from 'decimal.js';
import { Money } from '../value-objects/money.vo';
/**
* Trading calculator.
*
* Core formulas:
* 1. Base minute burn rate = 10 billion ÷ (365×4×1440) = 4756.468797564687
* 2. Price = green points ÷ (100.02B − black hole − circulation pool)
* 3. Sell burn multiplier = (10 billion − black hole) ÷ (2 million − circulation pool)
* 4. Sell burn amount = sell quantity × burn multiplier
* 5. Sell amount = (sell quantity + burn quantity) × price
* 6. Display asset value = (balance + balance × multiplier) × price
* 7. Asset growth per second = daily allocation ÷ 24 ÷ 60 ÷ 60
*/
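// Worked arithmetic for the formulas above (illustrative only, derived from the constants defined below):
//   formula 1: 10,000,000,000 ÷ (365 × 4 × 1440) = 10,000,000,000 ÷ 2,102,400 ≈ 4756.468797564687
//   formula 3 at launch (nothing burned, empty circulation pool): (10,000,000,000 − 0) ÷ (2,000,000 − 0) = 5000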
export class TradingCalculatorService {
// Total shares: 100.02B
static readonly TOTAL_SHARES = new Decimal('100020000000');
// Burn target: 10 billion (burned over 4 years)
static readonly BURN_TARGET = new Decimal('10000000000');
// Burn period: number of minutes in 4 years
static readonly BURN_PERIOD_MINUTES = 365 * 4 * 1440; // 2102400
// Circulation pool target: 2 million
static readonly CIRCULATION_POOL_TARGET = new Decimal('2000000');
// Base per-minute burn amount: 10 billion ÷ (365×4×1440)
static readonly BASE_MINUTE_BURN_RATE = TradingCalculatorService.BURN_TARGET.dividedBy(
TradingCalculatorService.BURN_PERIOD_MINUTES,
);
/**
* Compute the share price.
* Price = green points ÷ (total shares − black hole − circulation pool)
*
* @param greenPoints green points balance
* @param blackHoleAmount total burned (black hole)
* @param circulationPoolAmount circulation pool shares
* @returns share price
*/
calculatePrice(
greenPoints: Money,
blackHoleAmount: Money,
circulationPoolAmount: Money,
): Money {
// Effective denominator = 100.02B − black hole − circulation pool
const effectiveDenominator = TradingCalculatorService.TOTAL_SHARES
.minus(blackHoleAmount.value)
.minus(circulationPoolAmount.value);
if (effectiveDenominator.isZero() || effectiveDenominator.isNegative()) {
return Money.zero();
}
// Price = green points / effective denominator
const price = greenPoints.value.dividedBy(effectiveDenominator);
return new Money(price);
}
/**
* Effective denominator = total shares − black hole − circulation pool
*/
calculateEffectiveDenominator(
blackHoleAmount: Money,
circulationPoolAmount: Money,
): Money {
const denominator = TradingCalculatorService.TOTAL_SHARES
.minus(blackHoleAmount.value)
.minus(circulationPoolAmount.value);
if (denominator.isNegative()) {
return Money.zero();
}
return new Money(denominator);
}
/**
* Compute the sell burn multiplier.
* Multiplier = (10 billion − black hole) ÷ (2 million − circulation pool)
*
* @param blackHoleAmount total burned (black hole)
* @param circulationPoolAmount circulation pool shares
* @returns burn multiplier
*/
calculateSellBurnMultiplier(
blackHoleAmount: Money,
circulationPoolAmount: Money,
): Decimal {
// Numerator = 10 billion − black hole burned amount
const numerator = TradingCalculatorService.BURN_TARGET.minus(blackHoleAmount.value);
// Denominator = 2 million − circulation pool amount
const denominator = TradingCalculatorService.CIRCULATION_POOL_TARGET.minus(
circulationPoolAmount.value,
);
// Guard against division by zero or a negative denominator
if (denominator.isZero() || denominator.isNegative()) {
// When the circulation pool target is reached, cap the multiplier at a maximum reasonable value
return new Decimal('5'); // or another business-defined maximum
}
if (numerator.isNegative()) {
// When the black hole target is reached, stop burning
return new Decimal('0');
}
return numerator.dividedBy(denominator);
}
/**
* Burn amount = sell quantity × burn multiplier
*
* @param sellQuantity quantity being sold
* @param burnMultiplier current burn multiplier
* @returns burn amount
*/
calculateSellBurnAmount(sellQuantity: Money, burnMultiplier: Decimal): Money {
const burnAmount = sellQuantity.value.times(burnMultiplier);
return new Money(burnAmount);
}
/**
* Sell amount = (sell quantity + burn quantity) × price
*
* @param sellQuantity quantity being sold
* @param burnQuantity burn quantity
* @param price current share price
* @returns settlement amount
*/
calculateSellAmount(sellQuantity: Money, burnQuantity: Money, price: Money): Money {
const effectiveQuantity = sellQuantity.value.plus(burnQuantity.value);
const amount = effectiveQuantity.times(price.value);
return new Money(amount);
}
/**
* Display asset value = (balance + balance × multiplier) × price
*
* @param shareBalance share balance
* @param burnMultiplier current burn multiplier
* @param price current share price
* @returns display asset value
*/
calculateDisplayAssetValue(
shareBalance: Money,
burnMultiplier: Decimal,
price: Money,
): Money {
// Effective shares = balance + balance × multiplier = balance × (1 + multiplier)
const multiplierFactor = new Decimal(1).plus(burnMultiplier);
const effectiveShares = shareBalance.value.times(multiplierFactor);
const assetValue = effectiveShares.times(price.value);
return new Money(assetValue);
}
/**
* Growth per second = daily allocation ÷ 24 ÷ 60 ÷ 60
*
* @param dailyAllocation daily allocation amount
* @returns per-second growth
*/
calculateAssetGrowthPerSecond(dailyAllocation: Money): Money {
const secondsPerDay = 24 * 60 * 60; // 86400
const perSecond = dailyAllocation.value.dividedBy(secondsPerDay);
return new Money(perSecond);
}
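// Illustrative example: a daily allocation of 86.4 grows by 86.4 ÷ 86400 = 0.001 per second.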
/**
* Per-minute burn rate = (10 billion − black hole) ÷ remaining minutes
*
* @param blackHoleAmount total burned (black hole)
* @param remainingMinutes minutes left in the burn period
* @returns per-minute burn rate
*/
calculateMinuteBurnRate(blackHoleAmount: Money, remainingMinutes: number): Money {
if (remainingMinutes <= 0) {
return Money.zero();
}
// Remaining to burn = 10 billion − already burned
const remainingBurn = TradingCalculatorService.BURN_TARGET.minus(blackHoleAmount.value);
if (remainingBurn.isZero() || remainingBurn.isNegative()) {
return Money.zero();
}
const minuteRate = remainingBurn.dividedBy(remainingMinutes);
return new Money(minuteRate);
}
/**
* Remaining minutes of the 4-year burn period since activation.
*
* @param activatedAt time the trading system was activated
* @returns remaining minutes (never negative)
*/
calculateRemainingMinutes(activatedAt: Date): number {
const now = new Date();
const elapsedMs = now.getTime() - activatedAt.getTime();
const elapsedMinutes = Math.floor(elapsedMs / (60 * 1000));
return Math.max(0, TradingCalculatorService.BURN_PERIOD_MINUTES - elapsedMinutes);
}
/**
* Estimate the price after a sell: the circulation pool gains the sold
* quantity and the black hole gains the burn quantity.
*
* @param greenPoints green points balance
* @param blackHoleAmount total burned (black hole)
* @param circulationPoolAmount circulation pool shares
* @param sellQuantity quantity being sold
* @param burnQuantity burn quantity for the sell
* @returns estimated post-sell price
*/
calculatePriceAfterSell(
greenPoints: Money,
blackHoleAmount: Money,
circulationPoolAmount: Money,
sellQuantity: Money,
burnQuantity: Money,
): Money {
// After the sell, the circulation pool gains sellQuantity and the black hole gains burnQuantity
const newBlackHole = blackHoleAmount.add(burnQuantity);
const newCirculation = circulationPoolAmount.add(sellQuantity);
return this.calculatePrice(greenPoints, newBlackHole, newCirculation);
}
}

View File

@ -5,8 +5,15 @@ import { PrismaModule } from './persistence/prisma/prisma.module';
import { TradingAccountRepository } from './persistence/repositories/trading-account.repository';
import { OrderRepository } from './persistence/repositories/order.repository';
import { OutboxRepository } from './persistence/repositories/outbox.repository';
import { TradingConfigRepository } from './persistence/repositories/trading-config.repository';
import { BlackHoleRepository } from './persistence/repositories/black-hole.repository';
import { SharePoolRepository } from './persistence/repositories/share-pool.repository';
import { CirculationPoolRepository } from './persistence/repositories/circulation-pool.repository';
import { PriceSnapshotRepository } from './persistence/repositories/price-snapshot.repository';
import { ProcessedEventRepository } from './persistence/repositories/processed-event.repository';
import { RedisService } from './redis/redis.service';
import { KafkaProducerService } from './kafka/kafka-producer.service';
import { UserRegisteredConsumer } from './kafka/consumers/user-registered.consumer';
@Global()
@Module({
@ -32,10 +39,17 @@ import { KafkaProducerService } from './kafka/kafka-producer.service';
},
]),
],
controllers: [UserRegisteredConsumer],
providers: [
TradingAccountRepository,
OrderRepository,
OutboxRepository,
TradingConfigRepository,
BlackHoleRepository,
SharePoolRepository,
CirculationPoolRepository,
PriceSnapshotRepository,
ProcessedEventRepository,
KafkaProducerService,
{
provide: 'REDIS_OPTIONS',
@ -53,6 +67,12 @@ import { KafkaProducerService } from './kafka/kafka-producer.service';
TradingAccountRepository,
OrderRepository,
OutboxRepository,
TradingConfigRepository,
BlackHoleRepository,
SharePoolRepository,
CirculationPoolRepository,
PriceSnapshotRepository,
ProcessedEventRepository,
KafkaProducerService,
RedisService,
ClientsModule,

View File

@ -0,0 +1 @@
export * from './user-registered.consumer';

View File

@ -0,0 +1,189 @@
import { Controller, Logger, OnModuleInit } from '@nestjs/common';
import { EventPattern, Payload } from '@nestjs/microservices';
import { RedisService } from '../../redis/redis.service';
import { TradingAccountRepository } from '../../persistence/repositories/trading-account.repository';
import { OutboxRepository } from '../../persistence/repositories/outbox.repository';
import { ProcessedEventRepository } from '../../persistence/repositories/processed-event.repository';
import { TradingAccountAggregate } from '../../../domain/aggregates/trading-account.aggregate';
import {
TradingEventTypes,
TradingTopics,
TradingAccountCreatedPayload,
} from '../../../domain/events/trading.events';
// User-registered event structure (from auth-service)
interface UserRegisteredEvent {
eventId: string;
eventType: string;
payload: {
accountSequence: string;
phone: string;
source: 'V1' | 'V2';
registeredAt: string;
};
}
// 4-hour TTL
const IDEMPOTENCY_TTL_SECONDS = 4 * 60 * 60;
@Controller()
export class UserRegisteredConsumer implements OnModuleInit {
private readonly logger = new Logger(UserRegisteredConsumer.name);
constructor(
private readonly redis: RedisService,
private readonly tradingAccountRepository: TradingAccountRepository,
private readonly outboxRepository: OutboxRepository,
private readonly processedEventRepository: ProcessedEventRepository,
) {}
async onModuleInit() {
this.logger.log('UserRegisteredConsumer initialized - listening for user.registered events');
}
@EventPattern('auth.user.registered')
async handleUserRegistered(@Payload() message: any): Promise<void> {
// Parse the message envelope
const event: UserRegisteredEvent = message.value || message;
const eventId = event.eventId || message.eventId;
if (!eventId) {
this.logger.warn('Received event without eventId, skipping');
return;
}
const accountSequence = event.payload?.accountSequence;
if (!accountSequence) {
this.logger.warn(`Event ${eventId} missing accountSequence, skipping`);
return;
}
this.logger.debug(
`Processing user registered event: ${eventId}, accountSequence: ${accountSequence}`,
);
// Idempotency check
if (await this.isEventProcessed(eventId)) {
this.logger.debug(`Event ${eventId} already processed, skipping`);
return;
}
try {
// Check whether the trading account already exists
const existingAccount = await this.tradingAccountRepository.findByAccountSequence(
accountSequence,
);
if (existingAccount) {
this.logger.debug(`Trading account ${accountSequence} already exists`);
await this.markEventProcessed(eventId);
return;
}
// Create the trading account
const account = TradingAccountAggregate.create(accountSequence);
const accountId = await this.tradingAccountRepository.save(account);
// Publish the trading-account-created event
await this.publishAccountCreatedEvent(accountId, accountSequence);
// Mark the event as processed
await this.markEventProcessed(eventId);
this.logger.log(
`Trading account created for user ${accountSequence}, source: ${event.payload.source}`,
);
} catch (error) {
// Ignore unique-constraint errors caused by duplicate creation
if (error instanceof Error && error.message.includes('Unique constraint')) {
this.logger.debug(
`Trading account already exists for ${accountSequence}, marking as processed`,
);
await this.markEventProcessed(eventId);
return;
}
this.logger.error(
`Failed to create trading account for ${accountSequence}`,
error instanceof Error ? error.stack : error,
);
throw error; // let Kafka retry
}
}
/**
* Idempotency check backed by Redis + DB:
* 1. Check the Redis cache first (fast path)
* 2. Fall back to the database and backfill Redis on a hit
*/
private async isEventProcessed(eventId: string): Promise<boolean> {
const redisKey = `trading:processed-event:${eventId}`;
// 1. Check the Redis cache first (fast path)
const cached = await this.redis.get(redisKey);
if (cached) return true;
// 2. Check the database (the Redis entry may have expired or been lost after a restart)
const dbRecord = await this.processedEventRepository.findByEventId(eventId);
if (dbRecord) {
// Backfill the Redis cache
await this.redis.set(redisKey, '1', IDEMPOTENCY_TTL_SECONDS);
return true;
}
return false;
}
/**
* Mark an event as processed in both Redis and the database.
*/
private async markEventProcessed(eventId: string, eventType: string = 'user.registered'): Promise<void> {
const redisKey = `trading:processed-event:${eventId}`;
// 1. Write to the database (durable record)
try {
await this.processedEventRepository.create({
eventId,
eventType,
sourceService: 'auth-service',
});
} catch (error) {
// The record may already exist (concurrent handling); ignore unique-constraint errors
if (!(error instanceof Error && error.message.includes('Unique constraint'))) {
throw error;
}
}
// 2. Write to the Redis cache (4-hour TTL)
await this.redis.set(redisKey, '1', IDEMPOTENCY_TTL_SECONDS);
}
/**
* Publish a TradingAccountCreated event via the outbox.
*/
private async publishAccountCreatedEvent(
accountId: string,
accountSequence: string,
): Promise<void> {
try {
const payload: TradingAccountCreatedPayload = {
accountId,
accountSequence,
createdAt: new Date().toISOString(),
};
await this.outboxRepository.create({
aggregateType: 'TradingAccount',
aggregateId: accountId,
eventType: TradingEventTypes.TRADING_ACCOUNT_CREATED,
payload,
topic: TradingTopics.ACCOUNTS,
key: accountSequence,
});
this.logger.debug(`Published TradingAccountCreated event for ${accountSequence}`);
} catch (error) {
this.logger.error(`Failed to publish TradingAccountCreated event: ${error}`);
}
}
}

View File

@ -0,0 +1,190 @@
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { Money } from '../../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
export interface BlackHoleEntity {
id: string;
totalBurned: Money;
targetBurn: Money;
remainingBurn: Money;
lastBurnMinute: Date | null;
}
export interface BurnRecordEntity {
id: string;
blackHoleId: string;
burnMinute: Date;
burnAmount: Money;
remainingTarget: Money;
sourceType: string | null;
sourceAccountSeq: string | null;
sourceOrderNo: string | null;
memo: string | null;
createdAt: Date;
}
export type BurnSourceType = 'MINUTE_BURN' | 'SELL_BURN';
@Injectable()
export class BlackHoleRepository {
constructor(private readonly prisma: PrismaService) {}
async getBlackHole(): Promise<BlackHoleEntity | null> {
const record = await this.prisma.blackHole.findFirst();
if (!record) {
return null;
}
return this.toDomain(record);
}
async initializeBlackHole(targetBurn: Money): Promise<BlackHoleEntity> {
const existing = await this.prisma.blackHole.findFirst();
if (existing) {
return this.toDomain(existing);
}
const record = await this.prisma.blackHole.create({
data: {
totalBurned: 0,
targetBurn: targetBurn.value,
remainingBurn: targetBurn.value,
},
});
return this.toDomain(record);
}
/**
* Record a scheduled per-minute burn.
*/
async recordMinuteBurn(burnMinute: Date, burnAmount: Money): Promise<BurnRecordEntity> {
return this.recordBurn(burnMinute, burnAmount, 'MINUTE_BURN');
}
/**
* Record a sell-triggered burn.
*/
async recordSellBurn(
burnMinute: Date,
burnAmount: Money,
accountSeq: string,
orderNo: string,
): Promise<BurnRecordEntity> {
return this.recordBurn(burnMinute, burnAmount, 'SELL_BURN', accountSeq, orderNo);
}
/**
* Record a burn and update the black hole totals in one transaction.
*/
private async recordBurn(
burnMinute: Date,
burnAmount: Money,
sourceType: BurnSourceType,
sourceAccountSeq?: string,
sourceOrderNo?: string,
): Promise<BurnRecordEntity> {
const blackHole = await this.prisma.blackHole.findFirst();
if (!blackHole) {
throw new Error('Black hole not initialized');
}
const newTotalBurned = new Decimal(blackHole.totalBurned.toString()).plus(burnAmount.value);
const newRemainingBurn = new Decimal(blackHole.targetBurn.toString()).minus(newTotalBurned);
const memo =
sourceType === 'MINUTE_BURN'
? `每分钟自动销毁 ${burnAmount.toFixed(8)}`
: `卖出销毁, 账户[${sourceAccountSeq}], 订单[${sourceOrderNo}], 数量${burnAmount.toFixed(8)}`;
const [, burnRecord] = await this.prisma.$transaction([
this.prisma.blackHole.update({
where: { id: blackHole.id },
data: {
totalBurned: newTotalBurned,
remainingBurn: newRemainingBurn.isNegative() ? 0 : newRemainingBurn,
lastBurnMinute: burnMinute,
},
}),
this.prisma.burnRecord.create({
data: {
blackHoleId: blackHole.id,
burnMinute,
burnAmount: burnAmount.value,
remainingTarget: newRemainingBurn.isNegative() ? 0 : newRemainingBurn,
sourceType,
sourceAccountSeq,
sourceOrderNo,
memo,
},
}),
]);
return this.toBurnRecordDomain(burnRecord);
}
async getBurnRecords(
page: number,
pageSize: number,
sourceType?: BurnSourceType,
): Promise<{
data: BurnRecordEntity[];
total: number;
}> {
const where = sourceType ? { sourceType } : {};
const [records, total] = await Promise.all([
this.prisma.burnRecord.findMany({
where,
orderBy: { burnMinute: 'desc' },
skip: (page - 1) * pageSize,
take: pageSize,
}),
this.prisma.burnRecord.count({ where }),
]);
return {
data: records.map((r) => this.toBurnRecordDomain(r)),
total,
};
}
async getTodayBurnAmount(): Promise<Money> {
const today = new Date();
today.setHours(0, 0, 0, 0);
const result = await this.prisma.burnRecord.aggregate({
where: {
burnMinute: { gte: today },
},
_sum: { burnAmount: true },
});
return new Money(result._sum.burnAmount || 0);
}
private toDomain(record: any): BlackHoleEntity {
return {
id: record.id,
totalBurned: new Money(record.totalBurned),
targetBurn: new Money(record.targetBurn),
remainingBurn: new Money(record.remainingBurn),
lastBurnMinute: record.lastBurnMinute,
};
}
private toBurnRecordDomain(record: any): BurnRecordEntity {
return {
id: record.id,
blackHoleId: record.blackHoleId,
burnMinute: record.burnMinute,
burnAmount: new Money(record.burnAmount),
remainingTarget: new Money(record.remainingTarget),
sourceType: record.sourceType,
sourceAccountSeq: record.sourceAccountSeq,
sourceOrderNo: record.sourceOrderNo,
memo: record.memo,
createdAt: record.createdAt,
};
}
}

View File

@ -0,0 +1,199 @@
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { Money } from '../../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
export interface CirculationPoolEntity {
id: string;
totalShares: Money;
totalCash: Money;
totalInflow: Money;
totalOutflow: Money;
createdAt: Date;
updatedAt: Date;
}
export type CirculationPoolTransactionType =
| 'SHARE_IN'
| 'SHARE_OUT'
| 'CASH_IN'
| 'CASH_OUT'
| 'TRADE_BUY'
| 'TRADE_SELL';
@Injectable()
export class CirculationPoolRepository {
constructor(private readonly prisma: PrismaService) {}
async getPool(): Promise<CirculationPoolEntity | null> {
const record = await this.prisma.circulationPool.findFirst();
if (!record) {
return null;
}
return this.toDomain(record);
}
async initializePool(): Promise<CirculationPoolEntity> {
const existing = await this.prisma.circulationPool.findFirst();
if (existing) {
return this.toDomain(existing);
}
const record = await this.prisma.circulationPool.create({
data: {
totalShares: 0,
totalCash: 0,
totalInflow: 0,
totalOutflow: 0,
},
});
return this.toDomain(record);
}
/**
* Add sold shares to the circulation pool.
*/
async addSharesFromSell(
amount: Money,
accountSeq: string,
orderId: string,
memo?: string,
): Promise<void> {
const pool = await this.prisma.circulationPool.findFirst();
if (!pool) {
throw new Error('Circulation pool not initialized');
}
const balanceBefore = new Decimal(pool.totalShares.toString());
const balanceAfter = balanceBefore.plus(amount.value);
await this.prisma.$transaction([
this.prisma.circulationPool.update({
where: { id: pool.id },
data: {
totalShares: balanceAfter,
totalInflow: new Decimal(pool.totalInflow.toString()).plus(amount.value),
},
}),
this.prisma.circulationPoolTransaction.create({
data: {
poolId: pool.id,
type: 'TRADE_SELL',
assetType: 'SHARE',
amount: amount.value,
balanceBefore,
balanceAfter,
counterpartyType: 'USER',
counterpartyAccountSeq: accountSeq,
referenceId: orderId,
referenceType: 'ORDER',
memo: memo || `卖出积分股进入流通池 ${amount.toFixed(8)}`,
},
}),
]);
}
/**
* Remove shares from the circulation pool for a buy.
*/
async removeSharesForBuy(
amount: Money,
accountSeq: string,
orderId: string,
memo?: string,
): Promise<void> {
const pool = await this.prisma.circulationPool.findFirst();
if (!pool) {
throw new Error('Circulation pool not initialized');
}
const balanceBefore = new Decimal(pool.totalShares.toString());
const balanceAfter = balanceBefore.minus(amount.value);
if (balanceAfter.isNegative()) {
throw new Error('Insufficient shares in circulation pool');
}
await this.prisma.$transaction([
this.prisma.circulationPool.update({
where: { id: pool.id },
data: {
totalShares: balanceAfter,
totalOutflow: new Decimal(pool.totalOutflow.toString()).plus(amount.value),
},
}),
this.prisma.circulationPoolTransaction.create({
data: {
poolId: pool.id,
type: 'TRADE_BUY',
assetType: 'SHARE',
amount: amount.value,
balanceBefore,
balanceAfter,
counterpartyType: 'USER',
counterpartyAccountSeq: accountSeq,
referenceId: orderId,
referenceType: 'ORDER',
memo: memo || `买入积分股从流通池流出 ${amount.toFixed(8)}`,
},
}),
]);
}
/**
* Current share balance of the circulation pool.
*/
async getSharesAmount(): Promise<Money> {
const pool = await this.getPool();
if (!pool) {
return Money.zero();
}
return pool.totalShares;
}
async getTransactions(
page: number,
pageSize: number,
): Promise<{
data: any[];
total: number;
}> {
const pool = await this.prisma.circulationPool.findFirst();
if (!pool) {
return { data: [], total: 0 };
}
const [records, total] = await Promise.all([
this.prisma.circulationPoolTransaction.findMany({
where: { poolId: pool.id },
orderBy: { createdAt: 'desc' },
skip: (page - 1) * pageSize,
take: pageSize,
}),
this.prisma.circulationPoolTransaction.count({ where: { poolId: pool.id } }),
]);
return {
data: records.map((r) => ({
...r,
amount: r.amount.toString(),
balanceBefore: r.balanceBefore.toString(),
balanceAfter: r.balanceAfter.toString(),
})),
total,
};
}
private toDomain(record: any): CirculationPoolEntity {
return {
id: record.id,
totalShares: new Money(record.totalShares),
totalCash: new Money(record.totalCash),
totalInflow: new Money(record.totalInflow),
totalOutflow: new Money(record.totalOutflow),
createdAt: record.createdAt,
updatedAt: record.updatedAt,
};
}
}

View File

@ -47,6 +47,46 @@ export class OrderRepository {
}
}
/**
* Save an order together with its burn information.
*/
async saveWithBurnInfo(
aggregate: OrderAggregate,
burnQuantity: Money,
effectiveQuantity: Money,
): Promise<string> {
const data = {
orderNo: aggregate.orderNo,
accountSequence: aggregate.accountSequence,
type: aggregate.type,
status: aggregate.status,
price: aggregate.price.value,
quantity: aggregate.quantity.value,
filledQuantity: aggregate.filledQuantity.value,
remainingQuantity: aggregate.remainingQuantity.value,
averagePrice: aggregate.averagePrice.value,
totalAmount: aggregate.totalAmount.value,
burnQuantity: burnQuantity.value,
burnMultiplier: burnQuantity.isZero()
? 0
: burnQuantity.value.dividedBy(aggregate.filledQuantity.value),
effectiveQuantity: effectiveQuantity.value,
cancelledAt: aggregate.cancelledAt,
completedAt: aggregate.completedAt,
};
if (aggregate.id) {
await this.prisma.order.update({
where: { id: aggregate.id },
data,
});
return aggregate.id;
} else {
const created = await this.prisma.order.create({ data });
return created.id;
}
}
async findActiveOrders(type?: OrderType): Promise<OrderAggregate[]> {
const where: any = {
status: { in: [OrderStatus.PENDING, OrderStatus.PARTIAL] },

View File

@ -0,0 +1,134 @@
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { Money } from '../../../domain/value-objects/money.vo';
export interface PriceSnapshotEntity {
id: string;
snapshotTime: Date;
price: Money;
greenPoints: Money;
blackHoleAmount: Money;
circulationPool: Money;
effectiveDenominator: Money;
minuteBurnRate: Money;
createdAt: Date;
}
@Injectable()
export class PriceSnapshotRepository {
constructor(private readonly prisma: PrismaService) {}
async getLatestSnapshot(): Promise<PriceSnapshotEntity | null> {
const record = await this.prisma.priceSnapshot.findFirst({
orderBy: { snapshotTime: 'desc' },
});
if (!record) {
return null;
}
return this.toDomain(record);
}
async getSnapshotAt(time: Date): Promise<PriceSnapshotEntity | null> {
// Get the most recent snapshot at or before the given time
const record = await this.prisma.priceSnapshot.findFirst({
where: { snapshotTime: { lte: time } },
orderBy: { snapshotTime: 'desc' },
});
if (!record) {
return null;
}
return this.toDomain(record);
}
async createSnapshot(data: {
snapshotTime: Date;
price: Money;
greenPoints: Money;
blackHoleAmount: Money;
circulationPool: Money;
effectiveDenominator: Money;
minuteBurnRate: Money;
}): Promise<PriceSnapshotEntity> {
const record = await this.prisma.priceSnapshot.create({
data: {
snapshotTime: data.snapshotTime,
price: data.price.value,
greenPoints: data.greenPoints.value,
blackHoleAmount: data.blackHoleAmount.value,
circulationPool: data.circulationPool.value,
effectiveDenominator: data.effectiveDenominator.value,
minuteBurnRate: data.minuteBurnRate.value,
},
});
return this.toDomain(record);
}
async getPriceHistory(
startTime: Date,
endTime: Date,
limit: number = 1440,
): Promise<PriceSnapshotEntity[]> {
const records = await this.prisma.priceSnapshot.findMany({
where: {
snapshotTime: {
gte: startTime,
lte: endTime,
},
},
orderBy: { snapshotTime: 'asc' },
take: limit,
});
return records.map((r) => this.toDomain(r));
}
async getSnapshots(
page: number,
pageSize: number,
): Promise<{
data: PriceSnapshotEntity[];
total: number;
}> {
const [records, total] = await Promise.all([
this.prisma.priceSnapshot.findMany({
orderBy: { snapshotTime: 'desc' },
skip: (page - 1) * pageSize,
take: pageSize,
}),
this.prisma.priceSnapshot.count(),
]);
return {
data: records.map((r) => this.toDomain(r)),
total,
};
}
/**
* Delete snapshots older than the retention window.
*/
async cleanupOldSnapshots(retentionDays: number): Promise<number> {
const cutoffDate = new Date();
cutoffDate.setDate(cutoffDate.getDate() - retentionDays);
const result = await this.prisma.priceSnapshot.deleteMany({
where: { snapshotTime: { lt: cutoffDate } },
});
return result.count;
}
private toDomain(record: any): PriceSnapshotEntity {
return {
id: record.id,
snapshotTime: record.snapshotTime,
price: new Money(record.price),
greenPoints: new Money(record.greenPoints),
blackHoleAmount: new Money(record.blackHoleAmount),
circulationPool: new Money(record.circulationPool),
effectiveDenominator: new Money(record.effectiveDenominator),
minuteBurnRate: new Money(record.minuteBurnRate),
createdAt: record.createdAt,
};
}
}

View File

@ -0,0 +1,65 @@
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
export interface ProcessedEventEntity {
id: string;
eventId: string;
eventType: string;
sourceService: string;
processedAt: Date;
}
@Injectable()
export class ProcessedEventRepository {
constructor(private readonly prisma: PrismaService) {}
/**
* Find a processed event by its eventId.
*/
async findByEventId(eventId: string): Promise<ProcessedEventEntity | null> {
const record = await this.prisma.processedEvent.findUnique({
where: { eventId },
});
return record;
}
/**
* Record a processed event.
*/
async create(data: {
eventId: string;
eventType: string;
sourceService: string;
}): Promise<ProcessedEventEntity> {
return this.prisma.processedEvent.create({
data: {
eventId: data.eventId,
eventType: data.eventType,
sourceService: data.sourceService,
},
});
}
/**
* Check whether an event has been processed.
*/
async isProcessed(eventId: string): Promise<boolean> {
const count = await this.prisma.processedEvent.count({
where: { eventId },
});
return count > 0;
}
/**
* Delete processed-event records older than the given date.
* @param before cutoff date
*/
async deleteOldRecords(before: Date): Promise<number> {
const result = await this.prisma.processedEvent.deleteMany({
where: {
processedAt: { lt: before },
},
});
return result.count;
}
}

View File

@ -0,0 +1,191 @@
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { Money } from '../../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
export interface SharePoolEntity {
id: string;
greenPoints: Money;
totalInflow: Money;
totalOutflow: Money;
createdAt: Date;
updatedAt: Date;
}
export type SharePoolTransactionType = 'INJECT' | 'TRADE_IN' | 'TRADE_OUT';
export interface SharePoolTransactionEntity {
id: string;
poolId: string;
type: SharePoolTransactionType;
amount: Money;
balanceBefore: Money;
balanceAfter: Money;
referenceId: string | null;
referenceType: string | null;
memo: string | null;
createdAt: Date;
}
@Injectable()
export class SharePoolRepository {
constructor(private readonly prisma: PrismaService) {}
async getPool(): Promise<SharePoolEntity | null> {
const record = await this.prisma.sharePool.findFirst();
if (!record) {
return null;
}
return this.toDomain(record);
}
async initializePool(initialGreenPoints: Money = Money.zero()): Promise<SharePoolEntity> {
const existing = await this.prisma.sharePool.findFirst();
if (existing) {
return this.toDomain(existing);
}
const record = await this.prisma.sharePool.create({
data: {
greenPoints: initialGreenPoints.value,
totalInflow: initialGreenPoints.value,
totalOutflow: 0,
},
});
return this.toDomain(record);
}
/**
* Inject green points into the share pool.
*/
async inject(amount: Money, referenceId?: string, memo?: string): Promise<void> {
await this.updateBalance('INJECT', amount, true, referenceId, memo);
}
/**
* Green points flowing in from a buy trade.
*/
async tradeIn(amount: Money, tradeId: string): Promise<void> {
await this.updateBalance('TRADE_IN', amount, true, tradeId, `交易买入流入 ${amount.toFixed(8)}`);
}
/**
* Green points flowing out from a sell trade.
*/
async tradeOut(amount: Money, tradeId: string): Promise<void> {
await this.updateBalance(
'TRADE_OUT',
amount,
false,
tradeId,
`交易卖出流出 ${amount.toFixed(8)}`,
);
}
private async updateBalance(
type: SharePoolTransactionType,
amount: Money,
isInflow: boolean,
referenceId?: string,
memo?: string,
): Promise<void> {
const pool = await this.prisma.sharePool.findFirst();
if (!pool) {
throw new Error('Share pool not initialized');
}
const balanceBefore = new Decimal(pool.greenPoints.toString());
const balanceAfter = isInflow
? balanceBefore.plus(amount.value)
: balanceBefore.minus(amount.value);
if (balanceAfter.isNegative()) {
throw new Error('Insufficient green points in share pool');
}
const newTotalInflow = isInflow
? new Decimal(pool.totalInflow.toString()).plus(amount.value)
: pool.totalInflow;
const newTotalOutflow = !isInflow
? new Decimal(pool.totalOutflow.toString()).plus(amount.value)
: pool.totalOutflow;
await this.prisma.$transaction([
this.prisma.sharePool.update({
where: { id: pool.id },
data: {
greenPoints: balanceAfter,
totalInflow: newTotalInflow,
totalOutflow: newTotalOutflow,
},
}),
this.prisma.sharePoolTransaction.create({
data: {
poolId: pool.id,
type,
amount: amount.value,
balanceBefore,
balanceAfter,
referenceId,
referenceType: type === 'INJECT' ? 'INJECT' : 'TRADE',
memo,
},
}),
]);
}
async getTransactions(
page: number,
pageSize: number,
): Promise<{
data: SharePoolTransactionEntity[];
total: number;
}> {
const pool = await this.prisma.sharePool.findFirst();
if (!pool) {
return { data: [], total: 0 };
}
const [records, total] = await Promise.all([
this.prisma.sharePoolTransaction.findMany({
where: { poolId: pool.id },
orderBy: { createdAt: 'desc' },
skip: (page - 1) * pageSize,
take: pageSize,
}),
this.prisma.sharePoolTransaction.count({ where: { poolId: pool.id } }),
]);
return {
data: records.map((r) => this.toTransactionDomain(r)),
total,
};
}
private toDomain(record: any): SharePoolEntity {
return {
id: record.id,
greenPoints: new Money(record.greenPoints),
totalInflow: new Money(record.totalInflow),
totalOutflow: new Money(record.totalOutflow),
createdAt: record.createdAt,
updatedAt: record.updatedAt,
};
}
private toTransactionDomain(record: any): SharePoolTransactionEntity {
return {
id: record.id,
poolId: record.poolId,
type: record.type as SharePoolTransactionType,
amount: new Money(record.amount),
balanceBefore: new Money(record.balanceBefore),
balanceAfter: new Money(record.balanceAfter),
referenceId: record.referenceId,
referenceType: record.referenceType,
memo: record.memo,
createdAt: record.createdAt,
};
}
}

View File

@ -15,11 +15,11 @@ export class TradingAccountRepository {
return this.toDomain(record);
}
async save(aggregate: TradingAccountAggregate): Promise<void> {
async save(aggregate: TradingAccountAggregate): Promise<string> {
const transactions = aggregate.pendingTransactions;
await this.prisma.$transaction(async (tx) => {
await tx.tradingAccount.upsert({
const result = await this.prisma.$transaction(async (tx) => {
const account = await tx.tradingAccount.upsert({
where: { accountSequence: aggregate.accountSequence },
create: {
accountSequence: aggregate.accountSequence,
@ -55,9 +55,12 @@ export class TradingAccountRepository {
})),
});
}
return account.id;
});
aggregate.clearPendingTransactions();
return result;
}
async getTransactions(

View File

@ -0,0 +1,101 @@
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { Money } from '../../../domain/value-objects/money.vo';
import Decimal from 'decimal.js';
export interface TradingConfigEntity {
id: string;
totalShares: Money;
burnTarget: Money;
burnPeriodMinutes: number;
minuteBurnRate: Money;
isActive: boolean;
activatedAt: Date | null;
createdAt: Date;
updatedAt: Date;
}
@Injectable()
export class TradingConfigRepository {
constructor(private readonly prisma: PrismaService) {}
async getConfig(): Promise<TradingConfigEntity | null> {
const record = await this.prisma.tradingConfig.findFirst();
if (!record) {
return null;
}
return this.toDomain(record);
}
async initializeConfig(): Promise<TradingConfigEntity> {
const existing = await this.prisma.tradingConfig.findFirst();
if (existing) {
return this.toDomain(existing);
}
const record = await this.prisma.tradingConfig.create({
data: {
totalShares: new Decimal('100020000000'),
burnTarget: new Decimal('10000000000'),
burnPeriodMinutes: 2102400, // 365 * 4 * 1440
minuteBurnRate: new Decimal('4756.468797564687'),
isActive: false,
},
});
return this.toDomain(record);
}
async activate(): Promise<void> {
const config = await this.prisma.tradingConfig.findFirst();
if (!config) {
throw new Error('Trading config not initialized');
}
await this.prisma.tradingConfig.update({
where: { id: config.id },
data: {
isActive: true,
activatedAt: new Date(),
},
});
}
async deactivate(): Promise<void> {
const config = await this.prisma.tradingConfig.findFirst();
if (!config) {
return;
}
await this.prisma.tradingConfig.update({
where: { id: config.id },
data: { isActive: false },
});
}
async updateMinuteBurnRate(newRate: Money): Promise<void> {
const config = await this.prisma.tradingConfig.findFirst();
if (!config) {
throw new Error('Trading config not initialized');
}
await this.prisma.tradingConfig.update({
where: { id: config.id },
data: { minuteBurnRate: newRate.value },
});
}
private toDomain(record: any): TradingConfigEntity {
return {
id: record.id,
totalShares: new Money(record.totalShares),
burnTarget: new Money(record.burnTarget),
burnPeriodMinutes: record.burnPeriodMinutes,
minuteBurnRate: new Money(record.minuteBurnRate),
isActive: record.isActive,
activatedAt: record.activatedAt,
createdAt: record.createdAt,
updatedAt: record.updatedAt,
};
}
}

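The seeded constants in initializeConfig() are internally consistent: burnPeriodMinutes is four 365-day years expressed in minutes, and minuteBurnRate is burnTarget spread evenly across that period. A quick arithmetic check (standalone, not part of the committed code):

import Decimal from 'decimal.js';

// Re-derive the seeded defaults from first principles.
const burnTarget = new Decimal('10000000000');   // 10 billion
const burnPeriodMinutes = 365 * 4 * 1440;        // 2,102,400 minutes over 4 years
const minuteBurnRate = burnTarget.div(burnPeriodMinutes);

console.log(minuteBurnRate.toFixed(12));
// => 4756.468797564688, matching the hard-coded '4756.468797564687' up to truncation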
View File

@@ -3,10 +3,15 @@ const nextConfig = {
reactStrictMode: true,
output: 'standalone',
async rewrites() {
// NEXT_PUBLIC_API_URL should be the base URL of the backend service, e.g. http://mining-admin-service:3023
// Frontend requests to /api/xxx are forwarded to {API_URL}/api/v2/xxx
const apiBaseUrl = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3023';
// Strip any trailing /api/v2 so the prefix is not duplicated
const cleanUrl = apiBaseUrl.replace(/\/api\/v2\/?$/, '');
return [
{
source: '/api/:path*',
destination: `${process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3023'}/api/v2/:path*`,
destination: `${cleanUrl}/api/v2/:path*`,
},
];
},

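The rewrite fix makes the proxy tolerant of either form of NEXT_PUBLIC_API_URL. A quick illustration of the normalization; the URLs are examples only:

const clean = (u: string) => u.replace(/\/api\/v2\/?$/, '');

console.log(clean('http://mining-admin-service:3023'));          // 'http://mining-admin-service:3023'
console.log(clean('http://mining-admin-service:3023/api/v2/'));  // 'http://mining-admin-service:3023'
// destination becomes `${clean(url)}/api/v2/:path*`, so the prefix appears exactly once.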
View File

@@ -36,20 +36,22 @@ const actionLabels: Record<string, { label: string; className: string }> = {
export default function AuditLogsPage() {
const [page, setPage] = useState(1);
const [action, setAction] = useState<string>('');
const [action, setAction] = useState<string>('all');
const [keyword, setKeyword] = useState('');
const pageSize = 20;
const { data, isLoading } = useQuery({
const { data, isLoading, error } = useQuery({
queryKey: ['audit-logs', page, action, keyword],
queryFn: async () => {
const response = await apiClient.get('/audit', {
params: { page, pageSize, action: action || undefined, keyword: keyword || undefined },
params: { page, pageSize, action: action === 'all' ? undefined : action, keyword: keyword || undefined },
});
return response.data.data as PaginatedResponse<AuditLog>;
},
});
const items = data?.items ?? [];
return (
<div className="space-y-6">
<PageHeader title="审计日志" description="查看系统操作日志" />
@@ -71,7 +73,7 @@ export default function AuditLogsPage() {
<SelectValue placeholder="操作类型" />
</SelectTrigger>
<SelectContent>
<SelectItem value=""></SelectItem>
<SelectItem value="all"></SelectItem>
<SelectItem value="CREATE"></SelectItem>
<SelectItem value="UPDATE"></SelectItem>
<SelectItem value="DELETE"></SelectItem>
@@ -108,14 +110,14 @@ export default function AuditLogsPage() {
))}
</TableRow>
))
) : data?.items.length === 0 ? (
) : items.length === 0 ? (
<TableRow>
<TableCell colSpan={7} className="text-center py-8 text-muted-foreground">
</TableCell>
</TableRow>
) : (
data?.items.map((log) => {
items.map((log) => {
const actionInfo = actionLabels[log.action] || { label: log.action, className: '' };
return (
<TableRow key={log.id}>

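Switching from value="" to an 'all' sentinel is most likely needed because the shadcn/Radix Select rejects empty-string item values; the sentinel is then translated back to "no filter" before the request is sent. A small sketch of that mapping, mirroring the params object in the diff:

// 'all' is purely a UI sentinel; the API still receives `action: undefined` for "no filter".
function toAuditParams(page: number, pageSize: number, action: string, keyword: string) {
  return {
    page,
    pageSize,
    action: action === 'all' ? undefined : action,
    keyword: keyword || undefined,
  };
}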
View File

@@ -2,7 +2,7 @@
import { useState } from 'react';
import { PageHeader } from '@/components/layout/page-header';
import { useConfigs, useUpdateConfig, useTransferEnabled, useSetTransferEnabled } from '@/features/configs/hooks/use-configs';
import { useConfigs, useUpdateConfig, useTransferEnabled, useSetTransferEnabled, useMiningStatus, useActivateMining, useDeactivateMining } from '@/features/configs/hooks/use-configs';
import { Card, CardContent, CardHeader, CardTitle, CardDescription } from '@/components/ui/card';
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from '@/components/ui/table';
import { Button } from '@/components/ui/button';
@@ -11,7 +11,8 @@ import { Switch } from '@/components/ui/switch';
import { Dialog, DialogContent, DialogHeader, DialogTitle, DialogFooter } from '@/components/ui/dialog';
import { Label } from '@/components/ui/label';
import { Skeleton } from '@/components/ui/skeleton';
import { Pencil, Save, X } from 'lucide-react';
import { Badge } from '@/components/ui/badge';
import { Pencil, Save, X, Play, Pause, AlertCircle, CheckCircle2 } from 'lucide-react';
import type { SystemConfig } from '@/types/config';
const categoryLabels: Record<string, string> = {
@@ -24,8 +25,11 @@ const categoryLabels: Record<string, string> = {
export default function ConfigsPage() {
const { data: configs, isLoading } = useConfigs();
const { data: transferEnabled, isLoading: transferLoading } = useTransferEnabled();
const { data: miningStatus, isLoading: miningLoading } = useMiningStatus();
const updateConfig = useUpdateConfig();
const setTransferEnabled = useSetTransferEnabled();
const activateMining = useActivateMining();
const deactivateMining = useDeactivateMining();
const [editingConfig, setEditingConfig] = useState<SystemConfig | null>(null);
const [editValue, setEditValue] = useState('');
@@ -58,10 +62,123 @@ export default function ConfigsPage() {
{} as Record<string, SystemConfig[]>
);
const formatNumber = (value: string) => {
return parseFloat(value).toLocaleString();
};
return (
<div className="space-y-6">
<PageHeader title="配置管理" description="管理系统配置参数" />
{/* Mining status card */}
<Card>
<CardHeader>
<div className="flex items-center justify-between">
<div>
<CardTitle className="text-lg"></CardTitle>
<CardDescription></CardDescription>
</div>
{miningLoading ? (
<Skeleton className="h-6 w-16" />
) : miningStatus?.error ? (
<Badge variant="destructive" className="flex items-center gap-1">
<AlertCircle className="h-3 w-3" />
</Badge>
) : miningStatus?.isActive ? (
<Badge variant="default" className="flex items-center gap-1 bg-green-500">
<CheckCircle2 className="h-3 w-3" />
</Badge>
) : (
<Badge variant="secondary" className="flex items-center gap-1">
<Pause className="h-3 w-3" />
</Badge>
)}
</div>
</CardHeader>
<CardContent>
{miningLoading ? (
<Skeleton className="h-32 w-full" />
) : miningStatus?.error ? (
<div className="text-center py-4 text-muted-foreground">
<AlertCircle className="h-8 w-8 mx-auto mb-2 text-destructive" />
<p></p>
<p className="text-sm">{miningStatus.error}</p>
</div>
) : !miningStatus?.initialized ? (
<div className="text-center py-4 text-muted-foreground">
<AlertCircle className="h-8 w-8 mx-auto mb-2 text-yellow-500" />
<p></p>
<p className="text-sm"> seed </p>
</div>
) : (
<div className="space-y-4">
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="text-lg font-semibold"> {miningStatus.currentEra} </p>
</div>
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="text-lg font-semibold">{formatNumber(miningStatus.remainingDistribution)}</p>
</div>
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="text-lg font-semibold">{formatNumber(miningStatus.secondDistribution)}</p>
</div>
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="text-lg font-semibold">{miningStatus.accountCount}</p>
</div>
</div>
{miningStatus.blackHole && (
<div className="pt-4 border-t">
<p className="text-sm font-medium mb-2"></p>
<div className="grid grid-cols-3 gap-4">
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="font-semibold">{formatNumber(miningStatus.blackHole.totalBurned)}</p>
</div>
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="font-semibold">{formatNumber(miningStatus.blackHole.targetBurn)}</p>
</div>
<div className="space-y-1">
<p className="text-sm text-muted-foreground"></p>
<p className="font-semibold">{formatNumber(miningStatus.blackHole.remainingBurn)}</p>
</div>
</div>
</div>
)}
<div className="flex justify-end pt-4 border-t">
{miningStatus.isActive ? (
<Button
variant="destructive"
onClick={() => deactivateMining.mutate()}
disabled={deactivateMining.isPending}
>
<Pause className="h-4 w-4 mr-2" />
{deactivateMining.isPending ? '停用中...' : '停用挖矿'}
</Button>
) : (
<Button
onClick={() => activateMining.mutate()}
disabled={activateMining.isPending}
>
<Play className="h-4 w-4 mr-2" />
{activateMining.isPending ? '激活中...' : '激活挖矿'}
</Button>
)}
</div>
</div>
)}
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle className="text-lg"></CardTitle>

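The page depends on three new hooks from use-configs (useMiningStatus, useActivateMining, useDeactivateMining) whose implementation is not part of this diff. A hedged sketch of what they might look like with @tanstack/react-query and the shared apiClient; the endpoint paths and the MiningStatus shape are inferred from the fields the page renders, so treat them as assumptions rather than the committed code:

import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { apiClient } from '@/lib/api/client';

// Shape inferred from the fields rendered by ConfigsPage above.
export interface MiningStatus {
  initialized: boolean;
  isActive: boolean;
  currentEra: number;
  remainingDistribution: string;
  secondDistribution: string;
  accountCount: number;
  blackHole?: { totalBurned: string; targetBurn: string; remainingBurn: string };
  error?: string;
}

export function useMiningStatus() {
  return useQuery({
    queryKey: ['mining', 'status'],
    queryFn: async () => {
      // Endpoint path is an assumption; adjust to the real mining-admin API.
      const response = await apiClient.get('/mining/status');
      return response.data.data as MiningStatus;
    },
  });
}

export function useActivateMining() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: async () => apiClient.post('/mining/activate'),
    onSuccess: () => queryClient.invalidateQueries({ queryKey: ['mining', 'status'] }),
  });
}

export function useDeactivateMining() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: async () => apiClient.post('/mining/deactivate'),
    onSuccess: () => queryClient.invalidateQueries({ queryKey: ['mining', 'status'] }),
  });
}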
View File

@@ -1,168 +0,0 @@
'use client';
import { useState } from 'react';
import { PageHeader } from '@/components/layout/page-header';
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { apiClient } from '@/lib/api/client';
import { formatDateTime } from '@/lib/utils/date';
import { Card, CardContent, CardHeader, CardTitle, CardDescription } from '@/components/ui/card';
import { Button } from '@/components/ui/button';
import { Dialog, DialogContent, DialogHeader, DialogTitle, DialogDescription, DialogFooter } from '@/components/ui/dialog';
import { Skeleton } from '@/components/ui/skeleton';
import { useToast } from '@/lib/hooks/use-toast';
import { Play, CheckCircle, AlertCircle, Loader2 } from 'lucide-react';
interface InitializationStatus {
initialized: boolean;
initializedAt: string | null;
initializedBy: string | null;
distributionPoolBalance: string;
blackHoleBalance: string;
circulationPoolBalance: string;
}
export default function InitializationPage() {
const queryClient = useQueryClient();
const { toast } = useToast();
const [showConfirm, setShowConfirm] = useState(false);
const { data: status, isLoading } = useQuery({
queryKey: ['initialization', 'status'],
queryFn: async () => {
const response = await apiClient.get('/initialization/status');
return response.data.data as InitializationStatus;
},
});
const initializeMutation = useMutation({
mutationFn: async () => {
const response = await apiClient.post('/initialization/initialize');
return response.data;
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ['initialization', 'status'] });
toast({ title: '初始化成功', variant: 'success' as any });
setShowConfirm(false);
},
onError: () => {
toast({ title: '初始化失败', variant: 'destructive' });
},
});
const handleInitialize = () => {
initializeMutation.mutate();
};
return (
<div className="space-y-6">
<PageHeader title="系统初始化" description="初始化挖矿系统的基础数据" />
<Card>
<CardHeader>
<CardTitle className="text-lg"></CardTitle>
<CardDescription></CardDescription>
</CardHeader>
<CardContent>
{isLoading ? (
<Skeleton className="h-24 w-full" />
) : (
<div className="space-y-4">
<div className="flex items-center gap-4">
{status?.initialized ? (
<>
<CheckCircle className="h-8 w-8 text-green-500" />
<div>
<p className="font-medium text-green-600"></p>
<p className="text-sm text-muted-foreground">
: {formatDateTime(status.initializedAt)}
</p>
<p className="text-sm text-muted-foreground">: {status.initializedBy}</p>
</div>
</>
) : (
<>
<AlertCircle className="h-8 w-8 text-yellow-500" />
<div>
<p className="font-medium text-yellow-600"></p>
<p className="text-sm text-muted-foreground"></p>
</div>
</>
)}
</div>
{!status?.initialized && (
<Button onClick={() => setShowConfirm(true)} className="mt-4">
<Play className="h-4 w-4 mr-2" />
</Button>
)}
</div>
)}
</CardContent>
</Card>
{status?.initialized && (
<Card>
<CardHeader>
<CardTitle className="text-lg"></CardTitle>
</CardHeader>
<CardContent>
<div className="grid grid-cols-3 gap-6">
<div>
<p className="text-sm text-muted-foreground"></p>
<p className="text-xl font-bold font-mono">{status.distributionPoolBalance}</p>
</div>
<div>
<p className="text-sm text-muted-foreground"></p>
<p className="text-xl font-bold font-mono">{status.blackHoleBalance}</p>
</div>
<div>
<p className="text-sm text-muted-foreground"></p>
<p className="text-xl font-bold font-mono">{status.circulationPoolBalance}</p>
</div>
</div>
</CardContent>
</Card>
)}
<Card>
<CardHeader>
<CardTitle className="text-lg"></CardTitle>
</CardHeader>
<CardContent className="prose prose-sm max-w-none">
<p>:</p>
<ol className="list-decimal list-inside space-y-2 text-muted-foreground">
<li> 2 亿</li>
<li> 0</li>
<li> 0</li>
<li> (12% )</li>
<li> (1% )</li>
<li> (2% )</li>
<li></li>
</ol>
<p className="text-yellow-600 mt-4">注意: 初始化操作只能执行一次</p>
</CardContent>
</Card>
<Dialog open={showConfirm} onOpenChange={setShowConfirm}>
<DialogContent>
<DialogHeader>
<DialogTitle></DialogTitle>
<DialogDescription>
?
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={() => setShowConfirm(false)}>
</Button>
<Button onClick={handleInitialize} disabled={initializeMutation.isPending}>
{initializeMutation.isPending && <Loader2 className="h-4 w-4 mr-2 animate-spin" />}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}

View File

@@ -1,6 +1,6 @@
'use client';
import { useEffect } from 'react';
import { useEffect, useState } from 'react';
import { useRouter } from 'next/navigation';
import { useAppDispatch, useAppSelector } from '@/store/hooks';
import { getProfile } from '@/store/slices/auth.slice';
@@ -15,17 +15,34 @@ export default function DashboardLayout({ children }: { children: React.ReactNode }) {
const dispatch = useAppDispatch();
const { token, isAuthenticated, user } = useAppSelector((state) => state.auth);
const { isCollapsed } = useSidebar();
const [isInitialized, setIsInitialized] = useState(false);
// Wait for client-side hydration to finish before checking the token
useEffect(() => {
// On the client, check whether a token exists in localStorage
const storedToken = localStorage.getItem('admin_token');
if (!storedToken) {
// No token at all, so redirect to the login page
router.push('/login');
} else {
setIsInitialized(true);
}
}, [router]);
useEffect(() => {
if (!token) {
router.push('/login');
return;
}
if (!user) {
if (isInitialized && token && !user) {
dispatch(getProfile());
}
}, [token, user, dispatch, router]);
}, [isInitialized, token, user, dispatch]);
// Wait until initialization completes
if (!isInitialized) {
return (
<div className="flex min-h-screen items-center justify-center">
<Loader2 className="h-8 w-8 animate-spin text-primary" />
</div>
);
}
if (!token) {
return null;

Some files were not shown because too many files have changed in this diff.