Compare commits

...

320 Commits

Author SHA1 Message Date
hailin 16daa7403c fix(mining-admin): correct the Excel column indexes
The actual Excel layout is:
- index 0: sequence number
- index 1: registration ID
- index 2: adoption quantity (trees)
- index 3: mining start time
- index 4: batch
- index 5: authorized pre-mining days
- index 6: remarks

The previous code read the user ID from index 0, which was wrong; it now reads starting from index 1.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 19:58:17 -08:00
hailin ca5de3add1 debug: log raw Excel data 2026-01-21 19:46:31 -08:00
hailin 390cc3131d fix(contribution): fix T2/T3 back-payment records missing treeCount and baseContribution
When back-paying rewards, query the original adoption data from SyncedAdoption
so that back-payment records carry the correct tree count and base contribution value.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 19:38:18 -08:00
hailin e4c320970f fix(batch-mining): rework the staged mining calculation
Core changes:
1. Correct interpretation of preMineDays: the number of days this batch starts mining ahead of the next batch
2. New totalMiningDays: total days from the mining start date to today
3. Compute earnings stage by stage:
   - Stage 1: batch 1 mines alone for (preMineDays1 - preMineDays2) days
   - Stage 2: batches 1+2 mine together for (preMineDays2 - preMineDays3) days
   - Stage 3: batches 1+2+3 mine together for (preMineDays3 - 0) days
   - Final stage: all batches mine together for (totalMiningDays - days already consumed) days
4. Each stage splits earnings by the network hashrate ratio in effect at that time

Example:
- batch 1 preMineDays=3, batch 2 preMineDays=2, batch 3 preMineDays=1
- totalMiningDays=74 (from Nov 8 to Jan 21)
- Stage 1: batch 1 mines alone for 1 day (3-2=1)
- Stage 2: batches 1+2 mine together for 1 day (2-1=1)
- Stage 3: batches 1+2+3 mine together for 1 day (1-0=1)
- Stage 4: all batches mine together for 71 days (74-3=71)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:56:45 -08:00
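The staged calculation described in the commit above can be sketched as a small helper. This is an illustrative reconstruction from the commit message, not the actual service code; the function name is hypothetical.

```typescript
// Stage durations for the staged batch-mining calculation:
// sort batches by preMineDays descending; stage k lasts
// (preMineDays[k] - preMineDays[k+1]) days, and the final stage covers
// the remaining (totalMiningDays - max preMineDays) days with all batches.
function stageDurations(preMineDays: number[], totalMiningDays: number): number[] {
  const sorted = [...preMineDays].sort((a, b) => b - a);
  const stages: number[] = [];
  for (let i = 0; i < sorted.length; i++) {
    const next = i + 1 < sorted.length ? sorted[i + 1] : 0;
    stages.push(sorted[i] - next); // days this subset of batches mines together
  }
  stages.push(totalMiningDays - sorted[0]); // final stage: all batches together
  return stages;
}
```

With the commit's example, `stageDurations([3, 2, 1], 74)` yields the stage lengths 1, 1, 1, and 71, which sum back to the 74 total mining days.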
hailin af95f8da0c fix(mining-admin): derive mining days automatically from the mining start time
The code previously read preMineDays from Excel column 6, but that column is empty.
It now computes the actual number of mining days from the "mining start time" up to today.

Changes:
- Correct the Excel column indexes (the user ID is in column 1, not column 2)
- Date parsing supports multiple formats (2025.11.8, 2025-11-08, 2025/11/8)
- Automatically compute the days from the mining start date to today and use that as preMineDays

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:50:46 -08:00
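The multi-format date parsing plus day counting described above might look like the following sketch. Function names are hypothetical; only the three accepted formats come from the commit message.

```typescript
// Parse "2025.11.8", "2025-11-08", or "2025/11/8" into a UTC date.
function parseMiningStartDate(raw: string): Date {
  const m = raw.trim().match(/^(\d{4})[./-](\d{1,2})[./-](\d{1,2})$/);
  if (!m) throw new Error(`unsupported date format: ${raw}`);
  return new Date(Date.UTC(Number(m[1]), Number(m[2]) - 1, Number(m[3])));
}

// Whole days elapsed from the mining start date up to "today".
function miningDaysSince(start: Date, today: Date): number {
  const msPerDay = 24 * 60 * 60 * 1000;
  return Math.floor((today.getTime() - start.getTime()) / msPerDay);
}
```

All three formats normalize to the same date, and counting from 2025-11-08 to 2026-01-21 reproduces the 74-day figure used in the batch-mining example.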
hailin 7a5faad665 feat(mining-app): UI tweaks - hide some features and revise copy
- Exchange page: change "mining account" to "allocation account"
- "Me" page: hide the "Account Settings" section (not needed in v2.0)
- Contribution page: hide the "Today's Estimated Earnings" section
- Contribution detail page: do not show user IDs for upline peer contributions
- Participation records page: change "hashrate" to "contribution value"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:00:40 -08:00
hailin 8f0fc09a4c fix(mining-admin-web): unwrap the TransformInterceptor envelope on mining-admin-service responses
mining-admin-service also uses a TransformInterceptor that wraps every response as a { success, data, timestamp } structure,
so the frontend must read the actual payload from res.data.data.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 06:34:36 -08:00
hailin 30a82f09f3 fix(mining-admin): unwrap the TransformInterceptor envelope on mining-service responses
mining-service uses a TransformInterceptor that wraps every response as a { success, data, timestamp } structure,
so mining-admin-service must read the actual payload from result.data.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 06:26:56 -08:00
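The two unwrap fixes above follow the same pattern; a minimal sketch of the envelope and a helper to pull the payload out (the `Wrapped` type and `unwrap` name are assumptions, not the project's actual code):

```typescript
// Envelope shape produced by the TransformInterceptor, per the commits above.
interface Wrapped<T> {
  success: boolean;
  data: T;
  timestamp: string;
}

// Extract the actual payload, failing loudly on an unsuccessful upstream call.
function unwrap<T>(res: Wrapped<T>): T {
  if (!res.success) throw new Error("upstream call failed");
  return res.data;
}
```

With axios-style responses the envelope is itself nested under `res.data`, which is why the frontend fix reads `res.data.data`.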
hailin a02813a8ea fix(batch-mining): fix the mining-service API path called by mining-admin-service
mining-service routes are /api/v2/admin/batch-mining/...,
but mining-admin-service omitted the /api/v2 prefix, causing 404s

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 06:12:32 -08:00
hailin 7a4f5591b7 feat(batch-mining): add detailed debug logging
- mining-service batch-mining.service.ts: detailed logs in every method
- mining-admin-service batch-mining.service.ts: HTTP request and response logs
- mining-admin-service batch-mining.controller.ts: controller-level logs
- frontend batch-mining page.tsx: console.log statements

Helps diagnose post-deployment issues such as 404s

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 05:49:04 -08:00
hailin cb9831f2fc feat(mining-admin-web): map region codes to Chinese names on the frontend
The system account list now shows Chinese province/city names (e.g. "Guangzhou City Company") instead of raw region codes (e.g. "440100 account")

- Add region-codes.ts with a complete mapping of province/city administrative division codes
- Update accounts-table.tsx to convert names via getRegionDisplayName()
- Update account-card.tsx to use the region-code mapping
- Update the account detail page title to use the region-code mapping

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 05:20:50 -08:00
hailin 71151eaabf feat(mining): add batch back-payment mining
- New batch back-payment service and API (mining-service)
  - Accumulates network hashrate batch by batch
  - User hashrate = adopted trees × 22617 × 70%
  - Back-payment amount = (user hashrate / network hashrate) × per-second allocation × days × 86400
  - Guard against duplicate execution (can only run once)

- New file-upload and batch back-payment API (mining-admin-service)
  - Parses uploaded Excel files
  - Two-step preview-then-execute flow
  - Audit logging

- New batch back-payment page (mining-admin-web)
  - Excel file upload
  - Per-batch preview of the computed results
  - Execution confirmation dialog

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 04:59:13 -08:00
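The two formulas in the commit above translate directly into code. This is a sketch of the arithmetic only; function names and the sample inputs are illustrative, not taken from the service.

```typescript
// user hashrate = adopted trees × 22617 × 70%
function userHashrate(trees: number): number {
  return trees * 22617 * 0.7;
}

// back-payment = (user hashrate / network hashrate)
//              × per-second allocation × days × 86400 seconds/day
function backPayAmount(
  trees: number,
  networkHashrate: number,
  perSecondAllocation: number,
  days: number,
): number {
  return (userHashrate(trees) / networkHashrate) * perSecondAllocation * days * 86400;
}
```

For instance, 10 trees yield a hashrate of 158319; at a 10% share of the network with a 0.5-per-second allocation over 2 days, the back-payment comes to 8640.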
hailin f7dbe2f62b refactor(contribution): squash all migrations into 0001_init
- Merge the deleted_at column from 0002_add_soft_delete into 0001_init
- Remove the 0002_add_soft_delete_to_system_contribution_records directory
- Only a single init migration file remains

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 04:31:52 -08:00
hailin 21c6c25f7c refactor(contribution): merge the source_type migration into 0001_init
Merge the 0003_add_source_type migration into 0001_init/migration.sql

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 04:28:14 -08:00
hailin e7260be219 feat(contribution): add a hashrate source-type field to system accounts
- Add a sourceType field distinguishing where hashrate comes from:
  - FIXED_RATE: fixed-ratio allocation (OPERATION 12%, PROVINCE 1%, CITY 2%)
  - LEVEL_OVERFLOW: level overflow routed to headquarters (the upline has not unlocked that level)
  - LEVEL_NO_ANCESTOR: routed to headquarters because there is no upline at that level
  - BONUS_TIER_1/2/3: team bonuses not yet unlocked, routed to headquarters
- Add a levelDepth field recording the level depth (levels 1-15)
- Show a source-type column in the frontend table

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 04:23:50 -08:00
hailin e89c3166bf feat(mining-admin-web): show more detail in the system-account hashrate breakdown
- Show the adoption order: date and tree count
- Show the adopting user: name / phone number
- Update type definitions to match the new backend fields

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 03:36:02 -08:00
hailin 7c8ea7a9d7 feat(mining-admin): enrich the system-account hashrate breakdown
- Join adoption order info: tree count, adoption date, status, unit price
- Join user info: phone number (masked), name
- Makes every hashrate entry traceable to its source

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 03:32:27 -08:00
hailin 63aba087b6 feat(mining-admin): show concrete province/city names for system accounts
- Look up names in the SyncedProvince/SyncedCity tables by regionCode
- PROVINCE + 440000 renders as "Guangdong Province Company"
- CITY + 440100 renders as "Guangzhou City Company"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 03:31:05 -08:00
hailin 946978f624 fix(mining-admin): fix duplicated system-account rows caused by PostgreSQL's NULL-in-unique-constraint semantics
- Change the synced_system_contributions unique index to handle NULL values with COALESCE
- Change handleSystemAccountSynced and handleSystemContributionUpdated to use findMany
  instead of findFirst and clean up duplicate records automatically

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 03:29:24 -08:00
hailin eeaa43e044 feat(contribution): switch system-account breakdown records to soft deletion
- Add a deleted_at field to the SystemContributionRecord model
- Change deleteContributionRecordsByAdoption to soft-delete (set deleted_at)
- Change findContributionRecords to filter out deleted records (deletedAt: null)
- Add the database migration file

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 03:09:41 -08:00
hailin e0eb734196 fix(contribution): subtract hashrate from HEADQUARTERS and delete the breakdown when a user claims a reward
- Add a subtractContribution method that reduces system-account hashrate
- Add a deleteContributionRecordsByAdoption method that deletes breakdown records
- Update HEADQUARTERS in sync when a reward is claimed in BonusClaimService
2026-01-21 02:56:58 -08:00
hailin fda022d29c fix(frontend): add regionCode to the SystemAccount type 2026-01-21 02:23:35 -08:00
hailin 974b45554d feat(contribution): create breakdown records for HEADQUARTERS' unallocated hashrate
- Create a HEADQUARTERS breakdown record for every piece of unallocated hashrate
- Publish a SystemContributionRecordCreatedEvent to sync it to mining-admin-service
- Breakdown records include the source user ID (sourceAccountSequence)
2026-01-21 02:20:36 -08:00
hailin 97e974b6da fix(frontend): pass the regionCode parameter to the hashrate-source API call 2026-01-21 02:13:41 -08:00
hailin 495a1445fd fix(mining-admin): fix the Prisma syntax for querying null values
Querying null values in Prisma requires { equals: null } rather than a bare null

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 01:58:08 -08:00
hailin 27a045e082 fix(contribution): add a regionCode field to the hashrate-breakdown event
Change the SystemContributionRecordCreatedEvent to split systemAccountType
into two independent fields, accountType and regionCode, so that mining-admin-service
can correctly sync breakdown records subdivided by province/city

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 01:25:46 -08:00
hailin 6de365e707 fix(mining-admin): fix the field mapping for the SystemContributionRecordCreated event
contribution-service publishes the event with a systemAccountType field;
mining-admin-service must map it correctly onto accountType and regionCode

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 01:09:44 -08:00
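The mapping fix above, together with the composite-key strings introduced later in this log (e.g. PROVINCE_440000, CITY_440100), suggests a split along the first underscore. This is a hypothetical sketch of that mapping, not the service's actual handler:

```typescript
// Split a composite account key such as "PROVINCE_440000" or "CITY_440100"
// into accountType + regionCode; keys without a region part
// (e.g. "HEADQUARTERS") yield a null regionCode.
function parseAccountKey(key: string): { accountType: string; regionCode: string | null } {
  const idx = key.indexOf("_");
  if (idx === -1) return { accountType: key, regionCode: null };
  return { accountType: key.slice(0, idx), regionCode: key.slice(idx + 1) };
}
```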
hailin 96da7518bf fix(system-accounts): fix the data flow for auto-creating province/city system accounts
1. contribution-service: fix the provinceCode mapping for CITY accounts
   - CITY's provinceCode was previously set to the cityCode by mistake
   - It now passes the real provinceCode, used for creating the province
2. mining-wallet-service: fix the topic for system-account creation events
   - Previously published to mining-wallet.system-account.created
   - Now published to cdc.mining-wallet.outbox so mining-admin-service can sync it

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 00:28:48 -08:00
hailin cded4b2134 fix(mining-admin): drive the system-account list from hashrate accounts
Changes to getSystemAccounts:
- Use synced_system_contributions as the primary data source
- Join wallet data and mining data
- Show every province/city hashrate account (not only the ones with a wallet)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 23:06:46 -08:00
hailin 86c8ede198 fix(mining-admin): fix CDC event eventId parsing
Events published by mining-wallet-service carry an eventId field rather than id,
so the object returned by normalizeServiceEvent had no id property.

Fix: in the camelCase event path, prefer data.id and fall back to data.eventId

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 22:41:55 -08:00
hailin 0a199ae3b5 Revert "fix(mining-admin): fix CDC events missing an eventId"
This reverts commit fff56e8baa.
2026-01-20 22:38:42 -08:00
hailin fff56e8baa fix(mining-admin): fix CDC events missing an eventId
- Support multiple id field names in normalizeServiceEvent
- When an event has no id, generate a fallback ID from aggregateId + timestamp
- Validate event.id in withIdempotency to avoid creating invalid records
- Fix camelCase events that may lack an id field

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 22:09:34 -08:00
hailin 7e61ac7ff2 fix(system-accounts): fix Prisma compound-unique-key queries with a nullable regionCode
- Change every findUnique on the accountType_regionCode compound key to findFirst
- Change every upsert to a findFirst + create/update pattern
- Reason: Prisma compound unique keys do not support findUnique over nullable fields

Affected services:
- mining-service: admin.controller.ts, system-mining-account.repository.ts
- mining-admin-service: cdc-sync.service.ts, system-accounts.service.ts

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 21:45:13 -08:00
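The findFirst + create/update pattern the commit above swaps in for upsert can be illustrated with an in-memory store. Everything here (the store, field names, `addContribution` signature) is a sketch for illustration, not the actual repository code:

```typescript
// In-memory sketch of the findFirst + create/update pattern used instead of
// upsert when a compound unique key has a nullable column (regionCode).
interface SystemAccount {
  accountType: string;
  regionCode: string | null;
  totalContribution: number;
}

const store: SystemAccount[] = [];

function addContribution(
  accountType: string,
  regionCode: string | null,
  amount: number,
): SystemAccount {
  // "findFirst": match both key parts, null-safe (null === null holds here,
  // unlike a SQL unique index where NULLs never compare equal)
  let account = store.find(
    (a) => a.accountType === accountType && a.regionCode === regionCode,
  );
  if (!account) {
    // "create" when the row is missing
    account = { accountType, regionCode, totalContribution: 0 };
    store.push(account);
  }
  // "update": accumulate the contribution
  account.totalContribution += amount;
  return account;
}
```

Note this emulation sidesteps exactly the PostgreSQL behavior that caused the duplicate-row bug fixed in commit 946978f624: in SQL, two rows with a NULL regionCode do not violate a plain unique constraint.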
hailin 40ac037c03 fix(contribution): fix TypeScript type errors from the nullable regionCode in system-account queries
## Problem
- Prisma's generated types do not allow passing null in a unique where condition
- addContribution was being called with extra arguments
- findByType returns an array but was used as a single object

## Fix
- findByTypeAndRegion: use findFirst instead of findUnique
- ensureSystemAccountsExist: use findFirst + create instead of upsert
- addContribution: use findFirst + create/update instead of upsert
- Correct the arguments of the HEADQUARTERS account sync-event call

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 21:37:44 -08:00
hailin 9062346650 refactor(system-accounts): drop the baseType field; use accountType+regionCode as the compound unique key
## Main changes

### Data-model simplification
- Remove the redundant baseType field; accountType already carries the type information
- Use accountType (OPERATION/PROVINCE/CITY/HEADQUARTERS) + regionCode (province/city code) as the compound unique key
- All queries now use accountType+regionCode; database auto-increment IDs are fully retired

### contribution-service
- Drop baseType from the SystemAccount table; switch to the accountType+regionCode unique constraint
- Update the hashrate allocation logic so province/city accounts use their own regionCode
- Add a regionCode field to published events

### mining-service
- SystemMiningAccount table uses the accountType+regionCode unique constraint
- API changed to the /system-accounts/:accountType/records?regionCode=xxx format
- Mining allocation supports province/city subdivision

### mining-admin-service
- SyncedSystemContribution table uses the accountType+regionCode unique constraint
- CDC sync handlers adapted to the new format
- APIs uniformly query by accountType+regionCode

## API examples
- Operations account: GET /admin/system-accounts/OPERATION/records
- Guangdong Province: GET /admin/system-accounts/PROVINCE/records?regionCode=440000
- Guangzhou City: GET /admin/system-accounts/CITY/records?regionCode=440100
- Headquarters: GET /admin/system-accounts/HEADQUARTERS/records

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 21:29:01 -08:00
hailin 81b2e7a4c2 refactor(migrations): squash each service's migration files into a single 0001_init
Merge every service's migrations into one initialization migration:
- contribution-service: 3→1 (includes region support)
- mining-service: 4→1 (includes per-second allocation and region support)
- mining-admin-service: 4→1 (includes region and hashrate-breakdown sync)
- auth-service: 2→1 (includes CDC idempotency)
- trading-service: 9→1 (includes burn system / market maker / C2C)
- mining-wallet-service: 2→1 (includes the SHARE_POOL split)

All migrations now use the TEXT type consistently (not VARCHAR)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 20:34:56 -08:00
hailin 9c816266ac fix(schema): use TEXT consistently instead of VARCHAR
Problem:
- The schema and migrations previously used VARCHAR(n) length limits
- Prisma's String type maps to PostgreSQL TEXT by default
- VARCHAR and TEXT perform identically in PostgreSQL; the VARCHAR limit only adds risk

Fix:
1. contribution-service:
   - schema: drop @db.VarChar from accountType/baseType/regionCode/name
   - migration: VARCHAR -> TEXT

2. mining-service:
   - schema: drop @db.VarChar from accountType/baseType/regionCode/name
   - migration: VARCHAR -> TEXT

3. mining-admin-service:
   - migration: VARCHAR -> TEXT (the schema already used TEXT)

Principle: use Prisma String as-is, without @db.VarChar()

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 20:24:01 -08:00
hailin 5f2f223f7b fix(contribution): fix SystemAccountSyncedEvent missing the baseType/regionCode arguments
Problem:
- The republishSystemAccounts endpoint in admin.controller.ts constructed SystemAccountSyncedEvent
  with only 4 arguments, but the constructor takes 6
- The baseType (base type) and regionCode (region code) arguments were missing

Fix:
- Pass account.baseType and account.regionCode
- Matches the call in contribution-calculation.service.ts

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 20:12:49 -08:00
hailin 09b0bc077e feat(system-accounts): subdivide system-account hashrate and mining allocation by province/city
## Core features

### 1. Hashrate allocated per province/city
- accountType changes from an enum to a composite-key string: PROVINCE_440000, CITY_440100
- New baseType (base type) and regionCode (region code) fields
- Adoptions are allocated to the specific province/city account based on selectedProvince/selectedCity
- Without province/city info, hashrate goes to the aggregate account

### 2. System accounts join mining
- Operations, province, and city accounts mine with their own totalContribution
- The headquarters account (HEADQUARTERS) does not mine directly; it receives the earnings of pending-unlock hashrate
- Pending-unlock hashrate mines at 100%, with the earnings routed to headquarters

### 3. Hashrate source traceability
- New SystemContributionRecord recording the source of every hashrate entry
- New SystemContributionRecordCreatedEvent syncing the breakdown
- The frontend gains a "hashrate source" tab showing the breakdown

## Services changed

### contribution-service
- schema: SystemAccount gains baseType, regionCode
- contribution-calculator: builds the composite key per province/city
- system-account.repository: supports creating province/city accounts on demand
- New SystemContributionRecordCreatedEvent event

### mining-service
- schema: SystemMiningAccount changes from enum to string
- network-sync: handles sync events carrying baseType/regionCode
- mining-distribution: system accounts and pending-unlock hashrate join mining

### mining-admin-service
- schema: new SyncedSystemContributionRecord table
- cdc-sync: handles the SystemContributionRecordCreated event
- system-accounts.service: new hashrate-source breakdown and statistics APIs

### mining-admin-web
- New ContributionRecordsTable component
- The system-account detail page gains a "hashrate source" tab
- Shows the source adoption ID, user, allocation ratio, and amount

## Database migrations
- contribution-service: 20250120000001_add_region_to_system_accounts
- mining-service: 20250120000001_add_region_to_system_mining_accounts
- mining-admin-service: 20250120000001, 20250120000002

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 19:55:14 -08:00
hailin 5fa0fd5d1a fix(mining): write per-minute mining records for the HEADQUARTERS account
HEADQUARTERS' mining earnings come from pending-unlock hashrate; previously only
the account balance was updated, with no per-minute summary rows written to system_mining_records.

Both distribution paths now call accumulateSystemMinuteData for HEADQUARTERS,
so the frontend can correctly display the headquarters account's mining records.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 17:25:07 -08:00
hailin 1d5e3ebff2 fix(contribution): use upsert instead of update to avoid missing-record errors
Change addContribution to an upsert: create the system account when it does not
exist, otherwise increment its hashrate balance.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 08:43:37 -08:00
hailin 5ec310124d fix(contribution): ensure the HEADQUARTERS account exists before updating its hashrate
Fixes the "Record to update not found" error by calling ensureSystemAccountsExist
before addContribution, so the system account rows are created first.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 08:42:02 -08:00
hailin d844228711 fix(contribution): aggregate unallocated hashrate into the headquarters account (HEADQUARTERS)
The HEADQUARTERS account was previously skipped during hashrate allocation and never received
the unallocated total. Saving unallocated hashrate now also updates the HEADQUARTERS
contributionBalance and publishes a sync event for mining-admin-service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 08:26:53 -08:00
hailin e8e1193387 fix(trading): add the original_quantity database migration file
Fixes the missing trades.original_quantity column on the server

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 08:21:29 -08:00
hailin 6c77828944 fix(deploy): let cdc-resnapshot create connectors from config files automatically
- Fix database.hostname: postgres -> rwa-postgres
- cdc-resnapshot now checks whether each connector exists
- If a connector is missing, it is created automatically from its config file (with snapshot.mode=always)
- Fix the connector-to-config-file mapping logic

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 07:55:19 -08:00
hailin 60f2c29ad8 fix(deploy): fix CDC full resynchronization
Problem:
- CDC_CONSUMER_GROUPS was missing the phase consumer groups, so full-reset
  did not reset the contribution-service-cdc-phase-* consumer groups
- When Kafka topic data is lost, there was no way to trigger a Debezium re-snapshot

Fix:
- Add the phase consumer groups to CDC_CONSUMER_GROUPS
- Add a CDC_POSTGRES_CONNECTORS list
- Add a cdc-resnapshot command that forces a Debezium re-snapshot

Usage:
- ./deploy-mining.sh full-reset      # full reset
- ./deploy-mining.sh cdc-resnapshot  # trigger a re-snapshot after Kafka data loss

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 07:38:21 -08:00
hailin 5668de0a58 fix(asset): formatCompact keeps the original precision (up to 8 decimal places)
Fixes numbers such as cumulative sell amounts showing with no decimal places on the asset page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 05:27:50 -08:00
hailin 995dfa898e feat(trading): show the price change percentage and fix trade-detail data
1. Backend:
   - Add getFirstSnapshot() to fetch the first-day listing price
   - Add priceChangePercent and initialPrice to the PriceInfo interface
   - Change percentage = (current price - first-day price) / first-day price × 100%
   - Fix the calculation logic when originalQuantity is 0

2. Frontend:
   - Move the change percentage below the price on the trading page
   - Add a "vs. first listing day" caption
   - Color and icon vary with the sign of the change

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 05:24:23 -08:00
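The percentage formula in the commit above is straightforward; a minimal sketch (function name assumed):

```typescript
// priceChangePercent = (current price - first-day price) / first-day price × 100
function priceChangePercent(currentPrice: number, initialPrice: number): number {
  return ((currentPrice - initialPrice) / initialPrice) * 100;
}
```

A price that moved from 80 on the first listing day to 100 today is a +25% change; a move down to 60 would be -25%.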
hailin 7ff7157115 refactor(ui): hide the "Peers & Earnings" section on the "Me" page
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:44:16 -08:00
hailin e1cc364b0d refactor(ui): exchange-page copy adjustments
- Retitle "Point Share Trading" to "Point Share Exchange"
- Change "current point-share price" to "current point-share value"
- Remove the ¥ symbol before the price

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:43:08 -08:00
hailin 93c06920bd refactor(ui): change "referral count" to "referrals"
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:32:48 -08:00
hailin 9fb51fa30a fix(ui): restore the two-row layout of the downline peer contribution stats card
- Restore the "unlocked up" item (with the "级" (level) suffix dropped)
- Keep the two-row, four-item layout: referrals, unlocked up, unlocked down, participating

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:31:54 -08:00
hailin 33bf14b225 refactor(ui): adjust the downline peer contribution stats card
- Remove the "unlocked uplines" item
- Drop the "人" (people) unit from "referral count"
- Change "unlocked downlines" to "unlocked down"
- Show the three stats on a single row

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:29:40 -08:00
hailin 728497afc1 refactor(ui): replace "个人" (personal) with "本人" (own)
- team_tree_widget.dart: "personal participation" becomes "own participation" in the node-detail popup
- team_page.dart: "personal participation" becomes "own participation" in the stats card
- contribution_page.dart: "personal contribution" becomes "own contribution" in the contribution stats, and "contribution from personal participation" becomes "contribution from own participation"
- contribution_records_page.dart: the "personal" filter tag becomes "own", and the "personal participation" source-type label becomes "own participation"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:28:10 -08:00
hailin 9c705d7478 fix(ui): replace every "team" with "peers" across the UI
- team_tree_widget: team participation → peer participation
- contribution_records_page: team down/up contribution → peer down/up contribution
- about_page: team earnings → peer earnings
- help_center_page: team earnings/calculation → peer earnings/calculation
- contribution_page: team down/up contribution → peer down/up contribution
- team_page: My Team → My Peers, team participation → peer participation
- profile_page: Team & Earnings → Peers & Earnings, My Team → My Peers
- contribution_record: team bonus/extra bonus → peer bonus/extra bonus

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:24:54 -08:00
hailin 21e536d829 fix(ui): remove all RMB symbols and use the point-value unit consistently
- asset_page: total asset valuation now shows a "point value" suffix
- asset_page: remove the ¥ symbol from the valuation placeholder
- c2c_publish_page: total amount, unit price, and total switch to the point-value unit

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:19:52 -08:00
hailin 14d29b62ef fix(asset): hide the per-second growth display on the total-assets card
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:16:42 -08:00
hailin 1aa655f243 fix(asset): remove the redundant RMB valuation next to point value
Point value maps 1:1 to RMB, so there is no need to show "≈ ¥xxx"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:15:11 -08:00
hailin 8728fdce4c feat(trading): show full sell details in trade records (burn multiplier, effective point shares, fees, etc.)
- Backend: add an originalQuantity field to the Trade table storing the original sell quantity
- The quantity field now stores effective point shares (with the burn multiplier applied)
- The API returns full details: burn multiplier, effective point shares, total amount, amount entering the share pool
- The frontend trade-detail page shows the full sell information, styled like the sell-confirmation dialog

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 04:07:52 -08:00
hailin 7da98c248b feat(trading): add a trade-details tab to the trading-records page, showing fees
Backend:
- trading-service adds a GET /trading/trades API to fetch trade records
- Trade records include: total amount, fee (10%), and the amount actually received

Frontend:
- New TradeRecord entity and TradesPageModel
- The trading-records page gains tabs: "Order Records" and "Trade Details"
- Trade details show: price, quantity, total amount, fee, amount received

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 03:15:25 -08:00
hailin 63e02666ea fix(asset): include the burn multiplier in point-share valuation
Previously only shareBalance * currentPrice was used;
the correct formula is shareBalance * (1 + burnMultiplier) * currentPrice

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 03:07:03 -08:00
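The corrected valuation formula from the commit above, as a one-liner sketch (function name assumed):

```typescript
// point-share valuation = shareBalance × (1 + burnMultiplier) × currentPrice
function shareValuation(
  shareBalance: number,
  burnMultiplier: number,
  currentPrice: number,
): number {
  return shareBalance * (1 + burnMultiplier) * currentPrice;
}
```

With the multiplier included, 100 shares at a 0.5 burn multiplier and a price of 2 are worth 300, rather than the 200 the old formula produced.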
hailin 1c787a22a3 fix(mining): mining-service was subscribed to the wrong Kafka topic
Problem: mining-service subscribed to cdc.contribution.outbox (the Debezium CDC topic),
but contribution-service uses the Outbox Pattern and publishes directly to contribution.{eventType} topics.

Fix:
- Subscribe mining-service to the correct topic list
- Fix the message-parsing logic to support the Outbox Pattern message format
- Add a GET /admin/unallocated-contributions endpoint to contribution-service (for debugging)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 03:03:17 -08:00
hailin 0fddd3164a fix(trading): change the trading-record amount unit from USDT to point value
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 02:18:40 -08:00
hailin b1d8561ca5 fix(asset): show point-share valuation in point value rather than RMB
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 02:17:22 -08:00
hailin edfdb1a899 fix(asset): show the status as "none" instead of "order pending" when frozen point value is 0
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 02:16:35 -08:00
hailin 94d283696f fix(tss): fix signing failures after backup restore
Root cause:
- A restored wallet signed with the current device's partyId instead of the partyId from the original keygen
- The TSS protocol requires the signing partyId to match the keygen partyId exactly

Fix:
- Android: joinSignSessionViaGrpc() uses shareEntity.partyId instead of the current device's partyId
- Electron: cosign:joinSession and cosign:createSession use share.party_id
- Electron: handleCoSignStart() uses share.party_id for signing
- All gRPC communication and message subscriptions use the original partyId

Key change sites:
- TssRepository.kt: joinSignSessionViaGrpc() uses signingPartyId at line 1136
- main.ts: cosign:joinSession uses signingPartyId at line 1826
- main.ts: cosign:createSession uses share.party_id at lines 1624-1633
- main.ts: handleCoSignStart() uses share.party_id at line 836

Other:
- Drop the x86_64 ABI from the Android APK (emulator-only)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 00:39:05 -08:00
hailin c5db77d23a feat(tss): add point-share (eUSDT) and point-value (fUSDT) token support
New features:
- Add the ENERGY_POINTS (point shares / eUSDT) and FUTURE_POINTS (point value / fUSDT) token types
- Implement generic balance queries for all ERC-20 tokens
- Support transfers for all four tokens (KAVA, dUSDT, eUSDT, fUSDT)
- Update the UI to show every token balance plus a token selector

Token contract addresses (Kava EVM):
- dUSDT (green points): 0xA9F3A35dBa8699c8E681D8db03F0c1A8CEB9D7c3
- eUSDT (point shares): 0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931
- fUSDT (point value): 0x14dc4f7d3E4197438d058C3D156dd9826A161134

Technical improvements:
- Add a TokenConfig utility class to centralize token configuration
- Add an ERC20Selectors constants class defining contract method selectors
- Add a transaction_records table for transfer history (database version bumped to v4)
- Refactor balance queries and transfer logic to support multiple token types
- All ERC-20 tokens use 6 decimal places

Affected files:
- Android: Models.kt, TssRepository.kt, TransactionUtils.kt, Database.kt,
          AppModule.kt, TransferScreen.kt, WalletsScreen.kt
- Electron: transaction.ts, Home.tsx

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 23:36:58 -08:00
hailin d332ef99a7 feat(auth): hide verification-code login and registration
- Remove the login-method switcher (the code-login tab)
- Hide the registration entry point
- Simplify the login page to password login only
- Clean up unused variables and methods

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 23:10:24 -08:00
hailin d31bfc4221 feat(c2c): switch C2C trading to point value and add green points as a payment method
- C2C orders now trade point value instead of point shares
- Add green points as the default, non-removable payment method
- Add a 1.0-system ID input field (required for green-point payment)
- Allow selecting multiple payment methods at once
- Update the trading-help copy

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 23:07:35 -08:00
hailin 9333cd81c3 fix(kline): auto-load more history when data doesn't fill screen
- Always assume hasMoreHistory=true on initial load
- Auto-trigger loadMoreHistory when klines don't fill drawable area
- This ensures sparse data periods load all available history

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 22:53:48 -08:00
hailin 84d920f98f fix(kline): stop loading when no unique new history data
When deduplicated new klines are empty, mark hasMoreHistory as false
to prevent infinite loading attempts.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 22:31:38 -08:00
hailin 13f1b687ee feat(kline): add dynamic history loading on pan
Add support for loading more K-line history data when user pans to the
left edge. Backend API now accepts 'before' parameter for pagination.
Frontend uses KlinesNotifier to manage accumulated data with proper
deduplication.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 22:21:39 -08:00
hailin 99c1ff1fb7 fix(kline): convert time to local timezone for display
- Added toLocal() conversion in _formatTimeLabel (painter)
- Added toLocal() conversion in _formatDateTime (widget)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 21:55:56 -08:00
hailin 900ba4a555 feat(kline): add dynamic X-axis time labels
- Added _drawTimeAxis method to render time labels
- Labels dynamically adjust spacing based on candleWidth
- Shows HH:MM format, or M/D for midnight
- Labels follow pan/zoom movements
- Increased bottomPadding to 20px for label space

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 21:51:23 -08:00
hailin 453cab71e4 chore: remove debug logging from kline pan
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 21:49:34 -08:00
hailin f55fb13f26 feat(kline): implement true pan effect with scrollOffset
- Added scrollOffset parameter to KlinePainter and KlineVolumePainter
- Painters now draw all klines and apply scrollOffset for positioning
- Added canvas clipping to prevent drawing outside chart bounds
- Removed _getVisibleData and related helper methods (no longer needed)
- scrollOffset directly controls the visual position of all klines

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 21:43:45 -08:00
hailin 48ba72ce89 fix(kline): simplify pan logic - scrollX now always controls view position
- Removed _userHasPanned flag and special handling
- _scrollX is initialized in _scrollToCenter() at startup
- Pan gesture directly modifies _scrollX
- _getVisibleData() always uses _scrollX to calculate visible range

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 21:35:31 -08:00
hailin 7ae58e98e6 debug: add pan debugging info 2026-01-19 21:30:41 -08:00
hailin 684367941d feat(kline): support horizontal panning through K-line history
- Initial state: the latest candle is centered on screen (existing logic unchanged)
- After panning, the user can browse all historical data
- Panning state resets when the period is switched

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 21:25:59 -08:00
hailin f149c2a06a fix(kline): with large datasets, keep only recent data so the latest candle stays centered
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 20:39:33 -08:00
hailin a15ab7600f fix(kline): fix candle layout - draw from the left, center the latest candle
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 20:26:20 -08:00
hailin f51aa44cd9 feat(frontend): dark-mode support for the K-line chart components
- kline_chart_widget.dart: use dynamic AppColors and pass an isDark parameter
- kline_painter.dart: add an isDark parameter; grid/text/crosshair colors follow the theme
- kline_volume_painter.dart: add an isDark parameter; volume-chart colors follow the theme

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 19:42:52 -08:00
hailin 2745995a1a feat(frontend): full dark-mode support on the contribution page
- Every card background uses the dynamic AppColors.cardOf(context)
- Text colors switch to the dynamic textPrimaryOf/textSecondaryOf/textMutedOf helpers
- Icon background opacity adapts to dark mode (isDark ? 0.2 : 0.1)
- Dividers and borders use AppColors.borderOf(context)
- Remove unused static color constants

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 19:32:06 -08:00
hailin 61203d1baf feat(frontend): extend dark mode support to more pages
- Update asset_page.dart with full dark mode support
- Update c2c_market_page.dart with full dark mode support
- Update login_page.dart with full dark mode support

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 19:21:40 -08:00
hailin b0d1771b66 fix(contribution): restore static color constants to avoid a compile error
Keep the original light-mode palette (_grayText, _darkText, _bgGray, _lightGray);
the page background already switches with dark mode

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 19:10:13 -08:00
hailin bbe1754309 feat(frontend): add global dark-mode support
Core changes:
1. app_colors.dart - add light/dark color constant sets and dynamic color getters
   - backgroundOf(), surfaceOf(), cardOf(), etc. return theme-appropriate colors
   - Light: #F3F4F6 background, #FFFFFF cards, #1F2937 text
   - Dark: #111827 background, #1F2937 cards, #E5E7EB text

2. main.dart - update the ThemeData configuration
   - Add scaffoldBackgroundColor, appBarTheme, cardTheme, and other dark-theme settings

3. main_shell.dart - dark-mode support for the navigation bar
   - Replace hard-coded colors with the dynamic AppColors helpers

4. trading_page.dart - dark-mode support for the exchange page
   - All card and text colors use the dynamic color helpers
   - The transfer dialog also supports dark mode

5. contribution_page.dart - contribution page partially supported

Other pages still need updating for full dark-mode coverage

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 19:08:03 -08:00
hailin a47b935bce fix(tss-android): fix being unable to sign after backup restore
Root cause:
The backup data lacked the partyId field. After restoring on a new phone, signing
used the new device's generated partyId rather than the original partyId that was
encoded into LocalPartySaveData at keygen time, so the TSS signing protocol could
not match the key data and failed.

Fix:
1. Models.kt:
   - Add a partyId field to ShareRecord
   - Add a partyId field to ShareBackup; bump the backup format to v2
   - Update the fromShareRecord() and toShareRecord() methods

2. Database.kt:
   - Add a party_id column to ShareRecordEntity
   - Bump the database version to 3

3. AppModule.kt:
   - Add the MIGRATION_2_3 database migration script

4. TssRepository.kt:
   - Add a currentSigningPartyId member tracking the partyId used for the current signature
   - Include partyId when saving keygen output (3 call sites)
   - Preserve the original partyId on backup import
   - The signing flow uses shareEntity.partyId instead of the device partyId (3 call sites)
   - gRPC calls (markPartyReady, reportCompletion) use the original partyId

Key point: signing must use the original keygen partyId, because that ID is encoded
into the TSS key data structure. Backups now save this critical field, so restored
wallets sign with the correct partyId.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 18:56:27 -08:00
hailin b00de68b01 refactor(profile): reimplement dark mode while preserving the original light palette
Problem: the previous dark-mode implementation went through colorScheme, which also changed the original light-mode colors

Fix:
- Define light/dark color constant sets; light mode keeps the original design values
- Add Theme.brightness-based color helpers that return the right color dynamically
- Light mode: keep the original #1F2937/#6B7280/#9CA3AF/#F3F4F6/white palette
- Dark mode: use the #E5E7EB/#9CA3AF/#6B7280/#111827/#1F2937 palette
- Update every widget method to use the dynamic color helpers

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 18:44:31 -08:00
hailin d8df50a68f fix(c2c): fix a 500 error caused by pagination parameter type conversion
- Add the @Type(() => Number) decorator to the page/pageSize fields of QueryC2cOrdersDto and QueryMyC2cOrdersDto
- Query parameters arrive from the URL as strings and must be converted to numbers explicitly
- Add @IsInt() validation to ensure the parameters are integers
- Fixes Prisma findMany's take parameter receiving a String where an Int is expected

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 18:32:38 -08:00
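The coercion that `@Type(() => Number)` plus `@IsInt()` perform on the DTO can be shown with a plain helper. This is an illustration of the behavior, not the class-validator implementation; the function name and defaults are assumptions:

```typescript
// Coerce a raw query-string value into a positive integer, falling back to a
// default — mirroring what @Type(() => Number) + @IsInt() enforce on the DTO,
// so Prisma's `take` receives an Int rather than a String.
function coercePage(raw: string | undefined, fallback: number): number {
  const n = Number(raw);
  return Number.isInteger(n) && n > 0 ? n : fallback;
}
```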
hailin 63c192e90d feat(pending-contributions): add a pending-unlock hashrate ledger
What it is:
- Pending-unlock hashrate is level/bonus hashrate held back because the user has not met unlock conditions (e.g. not enough direct referrals)
- It still participates in mining, but its earnings go to the headquarters account (HEADQUARTERS)

Backend changes:
- mining-service: add 4 pending-hashrate APIs
  - GET /admin/pending-contributions - list pending hashrate (supports paging and type filters)
  - GET /admin/pending-contributions/summary - summary statistics (per-type totals, total mining earnings)
  - GET /admin/pending-contributions/:id/records - mining breakdown for a single record
  - GET /admin/pending-contributions/mining-records - aggregate view of all mining records
- mining-admin-service: add a proxy layer
  - New PendingContributionsService calling the mining-service API
  - New PendingContributionsController exposing the API to the frontend

Frontend changes:
- New pending-contributions feature module (API, hooks, type definitions)
- New /pending-contributions page
  - Summary cards (total amount, point shares already routed to headquarters, mining record count)
  - Per-type statistics
  - Hashrate list tab (source user, owner user, type, hashrate, reason, created time)
  - Mining records tab (time, type, hashrate, share, per-second allocation, mined amount, destination)
- Add "view ledger" links to the dashboard's level-hashrate and team-bonus cards

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 18:13:22 -08:00
hailin d815792deb feat(kline): hide the MA indicators by default
- Change the main-chart indicator default from 0 (MA) to -1 (none)
- MA5/MA10/MA20/MA60 moving averages no longer show on first load
- Keep indicator switching, so users can enable MA/EMA/BOLL manually
- Gives the chart a cleaner, simpler first impression

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 18:05:30 -08:00
hailin a97e0b51b8 feat(kline): center the latest candle on first load
- Add a _scrollToCenter() method computing the scroll offset that centers the latest candle
- Call _scrollToCenter() instead of _scrollToEnd() during initialization
- If the chart's total width is smaller than the screen, do not scroll; draw from the left
- Keep _scrollToEnd() for the refresh button and similar cases

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 18:03:23 -08:00
hailin 8326f8c35c fix(cdc): add a Debezium heartbeat to prevent WAL buildup
Background:
- The PostgreSQL pg_wal directory grew from 80MB to 60.4GB, pushing disk usage to 96%
- Root cause: the business tables of the wallet/planting/referral databases see no writes for long periods
- Debezium had heartbeat configured, but not heartbeat.action.query
- So the replication slots' restart_lsn could not advance and WAL files could not be recycled

Solution:
1. Add a debezium_heartbeat table to the wallet/planting/referral services
2. Configure heartbeat.action.query on the Debezium connectors
3. An UPDATE statement runs automatically every 60 seconds to advance restart_lsn

Changes:
- wallet-service/prisma/schema.prisma: add a DebeziumHeartbeat model
- planting-service/prisma/schema.prisma: add a DebeziumHeartbeat model
- referral-service/prisma/schema.prisma: add a DebeziumHeartbeat model
- scripts/debezium/wallet-connector.json: add the heartbeat.action.query configuration
- scripts/debezium/planting-connector.json: add the heartbeat.action.query configuration
- scripts/debezium/referral-connector.json: add the heartbeat.action.query configuration
- New Prisma migration files for the three services

Result:
- pg_wal shrank from 60.4GB to 80.2MB
- Disk usage fell from 96% to 40%
- Replication slot lag dropped from 51-60GB to KB levels

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 17:42:41 -08:00
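The connector change above uses Debezium's real `heartbeat.interval.ms` and `heartbeat.action.query` properties; a minimal fragment of what the *-connector.json files might gain (the table and column names here are assumptions matching the `debezium_heartbeat` table named in the commit, not the repo's exact config):

```json
{
  "heartbeat.interval.ms": "60000",
  "heartbeat.action.query": "UPDATE debezium_heartbeat SET last_heartbeat = NOW() WHERE id = 1"
}
```

The periodic UPDATE generates a WAL event on the otherwise idle database, letting the replication slot's restart_lsn advance so old WAL segments become reclaimable.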
hailin 964b06b370 fix(trading): fix the unresponsive "All" button on the "My Orders" card
Root cause:
- The "All >" button on the "My Orders" card in trading_page.dart had an empty onTap callback (a TODO comment)
- Tapping it did nothing

Fix:
- Add context.push(Routes.tradingRecords) to navigate to the trading-records page
- Users can now tap "All >" to see the full trading-order list

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 07:23:38 -08:00
hailin af339b19b9 feat(c2c): complete C2C OTC trading - payment info and order timeout handling
## Backend

### Prisma Schema (0008_add_c2c_orders migration)
- New C2cPaymentMethod enum (ALIPAY/WECHAT/BANK)
- New fields on the C2cOrder model:
  - Payment info: paymentMethod, paymentAccount, paymentQrCode, paymentRealName
  - Timeout config: paymentTimeoutMinutes (default 15), confirmTimeoutMinutes (default 60)
  - Deadlines: paymentDeadline, confirmDeadline
- New indexes to speed up timeout queries

### API layer
- c2c.dto.ts: new payment-info and timeout fields
- c2c.controller.ts: new C2C controller covering the full order lifecycle

### Business layer
- c2c.service.ts:
  - createOrder: sell orders must provide payment info (validated)
  - takeOrder: sets the payment deadline when an order is taken
  - confirmPayment: sets the confirm-receipt deadline when payment is confirmed
  - processExpiredOrders/expireOrder: handles expired orders (releases frozen assets)
- c2c-expiry.scheduler.ts: checks for expired orders every minute (with a distributed lock)

### Data layer
- c2c-order.repository.ts: new findExpiredOrders method
- trading-account.repository.ts: new unfreezeShares/unfreezeCash methods

## Frontend

### Data models
- c2c_order_model.dart:
  - New C2cPaymentMethod enum
  - New payment-info and timeout fields
  - New helpers: paymentMethodText, hasPaymentInfo, paymentRemainingSeconds, confirmRemainingSeconds

### API layer
- trading_remote_datasource.dart: createC2cOrder/takeC2cOrder accept payment-info parameters

### State management
- c2c_providers.dart: createOrder/takeOrder accept payment-info parameters

### UI layer
- c2c_publish_page.dart:
  - New payment method selector (Alipay/WeChat/bank card)
  - New inputs for payment account and payee name
  - Payment info is required when publishing a sell order
  - Confirmation dialog shows a payment-info summary

- c2c_order_detail_page.dart:
  - New payment-info card (distinct buyer/seller views)
  - New countdown progress bars (payment / confirm-receipt deadlines)
  - Highlighted warning when less than 5 minutes remain
  - Payment account can be copied
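The deadline handling described above can be sketched as follows; the function names here are illustrative, not taken from c2c.service.ts:

```typescript
// Compute order deadlines from the timeout configuration (minutes).
function addMinutes(from: Date, minutes: number): Date {
  return new Date(from.getTime() + minutes * 60_000);
}

// When a taker accepts the order, the payment countdown starts.
function paymentDeadline(takenAt: Date, paymentTimeoutMinutes = 15): Date {
  return addMinutes(takenAt, paymentTimeoutMinutes);
}

// When the buyer confirms payment, the seller's confirm window starts.
function confirmDeadline(paidAt: Date, confirmTimeoutMinutes = 60): Date {
  return addMinutes(paidAt, confirmTimeoutMinutes);
}

// Remaining seconds for the UI countdown; 0 once expired.
function remainingSeconds(deadline: Date, now: Date): number {
  return Math.max(0, Math.floor((deadline.getTime() - now.getTime()) / 1000));
}
```

The per-minute scheduler then only needs to query orders whose deadline is in the past, which is what the new indexes optimize.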

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 07:17:22 -08:00
hailin 928d6c8df2 refactor(frontend): improve K-line chart display and interaction
Main changes:
1. Hide technical indicators - temporarily hide the MA/EMA/BOLL main-chart indicators and the MACD/KDJ/RSI sub-chart indicators
   - All code is kept so they can be restored later (just uncomment)
   - Height split adjusted: main chart 75%, volume 25%

2. Fix single-finger pan
   - Remove the incorrect scale-threshold check `(details.scale - 1.0).abs() > 0.01`
   - Use `pointerCount > 1` to distinguish single-finger pan from two-finger zoom
   - Single-finger pan now drags the chart left and right as expected

3. Improve first-load display
   - New `_initialized` flag controls when initialization runs
   - New `_initializeCandleWidth()` method computes the candle width dynamically
   - On first load the candles fill the full visible width
   - Re-initializes automatically when the data size changes

Technical details:
- Use LayoutBuilder to get the actual chart width before initializing
- Use postFrameCallback to run initialization after layout completes
- Candle width is clamped to the 3.0-30.0 pixel range
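The width calculation can be sketched like this (the original is Dart; this TypeScript version and its names are illustrative):

```typescript
// Initial candle width: fill the visible chart width with the loaded
// candles, clamped to the 3.0-30.0 pixel range from the commit.
function initialCandleWidth(
  chartWidth: number,
  candleCount: number,
  minWidth = 3.0,
  maxWidth = 30.0,
): number {
  if (candleCount <= 0) return maxWidth;
  const fitted = chartWidth / candleCount;
  return Math.min(maxWidth, Math.max(minWidth, fitted));
}
```

With fewer candles than fit at the maximum width the chart simply shows wide candles; with very many candles the minimum width keeps them legible and the rest scroll off-screen.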

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 06:31:24 -08:00
hailin 7fb77bcc7e refactor(frontend): temporarily hide the "Team & Earnings" section on the "Me" page
- Comment out the _buildTeamEarningsSection call
- Keep the code for easy restoration later

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 06:19:54 -08:00
hailin f7cfb4ef8c refactor(frontend): polish copy on the participation records and contribution pages
Participation records page (planting_records_page.dart):
- Change the status "已开始挖矿" to "已开始"
- Change "xx棵" to "xx个"
- Change "单棵算力" to "单个算力"

Contribution page (contribution_page.dart):
- Change "本人种植" to "本人" in the contribution breakdown
- Drop the "级" suffix after "已解锁上级" and "已解锁下级"

Entity (planting_record.dart):
- Sync the status label change "已开始挖矿" → "已开始"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 06:17:19 -08:00
hailin d957e5a841 feat(admin): show mined shares and add a ledger view for system accounts
## Backend

### mining-service
- New GET /admin/system-accounts/:accountType/records - system account mining records (minute-level)
- New GET /admin/system-accounts/:accountType/transactions - system account transaction records

### mining-admin-service
- Add the @nestjs/axios dependency for HTTP calls
- Change SystemAccountsService to fetch mining data (totalMined, availableBalance) from mining-service over HTTP
- New proxy APIs for mining records and transaction records

## Frontend

### Type definitions
- SystemAccount gains totalMined, availableBalance, miningContribution, miningLastSyncedAt fields

### API layer
- New getMiningRecords and getTransactions API methods
- New SystemMiningRecord, SystemTransaction type definitions

### Hooks
- New useSystemAccountMiningRecords and useSystemAccountTransactions

### Components
- AccountsTable gains a "mined shares" column showing each system account's cumulative mined shares
- AccountsTable gains a "ledger" button linking to the account detail page

### New page
- New /system-accounts/[accountType] detail page
  - Account overview cards: current contribution power, mined shares, available balance, mining record count
  - Mining records tab: minute-level detail (time, power share, network power, per-second distribution, amount mined)
  - Transactions tab: full transaction history (time, type, amount, balance before/after, memo)
  - Paginated browsing

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 05:53:03 -08:00
hailin 07498271d3 feat(blockchain): deploy eUSDT and fUSDT token contracts
Two new ERC-20 token contracts deployed on KAVA mainnet:

## eUSDT (Energy USDT)
- Contract address: 0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931
- Total supply: 10.002 billion (10,002,000,000)
- Transaction hash: 0x5bebaa4a35378438ba5c891972024a1766935d2e01397a33502aa99e956a6b19

## fUSDT (Future USDT)
- Contract address: 0x14dc4f7d3E4197438d058C3D156dd9826A161134
- Total supply: 1 trillion (1,000,000,000,000)
- Transaction hash: 0x071f535971bc3a134dd26c182b6f05c53f0c3783e91fe6ef471d6c914e4cdb06

## Shared properties
- Fixed supply, no minting
- 6 decimal places (matching USDT)
- Standard ERC-20 interface
- Deployer: 0x4F7E78d6B7C5FC502Ec7039848690f08c8970F1E

## File layout
- eUSDT/: contract source, compile script, deploy script, README
- fUSDT/: contract source, compile script, deploy script, README
- contracts/README.md: added dUSDT documentation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 05:30:25 -08:00
hailin 8619b0bf26 feat(frontend): add dark mode support and fix mining records overflow
- Add darkTheme configuration in main.dart with themeModeProvider
- Refactor profile_page.dart to use Theme.of(context).colorScheme for dynamic theming
- Fix mining_records_page.dart layout overflow by using Expanded/Flexible widgets

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 05:06:28 -08:00
hailin 75e74b07c3 refactor(frontend): hide multiplier-related UI elements in asset page
- Comment out '积分股(含倍数)' display
- Comment out '含倍数资产' tag

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:56:07 -08:00
hailin e098cd44f6 refactor(frontend): replace '手机号' with '账号' in send shares page
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:54:24 -08:00
hailin 71a9961f94 refactor(frontend): update UI text labels
- Change contribution page subtitle to '股行用户引荐'
- Change receive shares page text from '手机号' to '账号'

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:52:38 -08:00
hailin 5ea8d8fea5 docs(frontend): update about page description
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:47:05 -08:00
hailin 1c9bb1aa60 refactor(frontend): rename '团队上级/下级' to '团队上/下贡献值'
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:45:15 -08:00
hailin 747e8bfee1 refactor(frontend): replace all '直推' with '引荐'
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:43:31 -08:00
hailin 1efe39c6bd refactor(frontend): replace all '认种' with '参与'
- Update terminology across all pages and entities
- Change '认种' to '参与' in user-facing text
- Update comments and documentation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:39:15 -08:00
hailin e48bf3e81f refactor(frontend): replace all '榴莲' references with '股行'
- Update app name and branding across all pages
- Remove references to '榴莲树' in contribution and adoption contexts
- Update user-facing text in login, register, splash pages
- Update FAQ content in help center and about pages

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:33:47 -08:00
hailin d9d46065e0 fix(asset): include the points balance in the total asset valuation
Total asset valuation = share value + points balance
- Share value = shares × (1 + burnMultiplier) × price
- Points balance = available points + frozen points

Previously only the share value was counted; the points (cash) balance is now included.
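As a sketch of the valuation formula (the field names are illustrative, not the app's actual names):

```typescript
interface AssetInputs {
  shares: number;          // share balance (积分股)
  burnMultiplier: number;  // current burn multiplier
  price: number;           // current share price
  availablePoints: number; // available points (积分值)
  frozenPoints: number;    // frozen points
}

// Total valuation = shares × (1 + burnMultiplier) × price + points balance
function totalAssetValue(a: AssetInputs): number {
  const shareValue = a.shares * (1 + a.burnMultiplier) * a.price;
  const pointsBalance = a.availablePoints + a.frozenPoints;
  return shareValue + pointsBalance;
}
```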

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:26:21 -08:00
hailin d4f7cd834a fix(android): add strings.xml to fix app name display on some phones
Some Android phones read the app name from strings.xml rather than
AndroidManifest.xml, so they showed the old name. Adding strings.xml and
referencing it fixes the compatibility issue.

- Add values/strings.xml defining app_name
- Add values-zh/strings.xml for Chinese-locale systems
- Change AndroidManifest.xml to reference @string/app_name

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:15:18 -08:00
hailin 7df57b9de5 fix(trading): include the burn multiplier in the sell estimate
The sell estimate now uses the same formula as the backend:
- Effective shares = sell quantity × (1 + burnMultiplier)
- Sell proceeds = effective shares × price × 0.9

Fixed in:
1. _calculateEstimate() - the estimated-proceeds display
2. The _handleTrade() confirmation dialog - shows the burn multiplier and the effective-share breakdown
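A minimal sketch of the formula above (illustrative names):

```typescript
// Sell estimate: effective shares carry the burn multiplier, and 10% of
// the proceeds goes to the share pool, so the seller keeps 90%.
function sellEstimate(quantity: number, burnMultiplier: number, price: number): number {
  const effectiveShares = quantity * (1 + burnMultiplier);
  return effectiveShares * price * 0.9;
}
```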

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-19 04:10:32 -08:00
hailin 6109bf4584 fix(kong): remove ws/wss protocols from WebSocket route
Kong handles the HTTP -> WebSocket upgrade automatically; the route protocols only need http/https

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 21:36:30 -08:00
hailin 94153058d8 chore(mining-app): remove WebSocket debug logs
Remove the frontend's WebSocket-related debug logs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 21:34:07 -08:00
hailin c05bcc9a76 feat(trading): route a 10% trade fee into the share pool
- Deduct a 10% fee from the seller's proceeds at trade execution
- The fee flows into the share pool (the greenPoints / 2M account)
- Add detailed ledger entries, including both parties' accounts and the fee source
- New fee field on the Trade table recording each trade's fee
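The split can be sketched as (illustrative names):

```typescript
// Split the seller's gross proceeds: 10% fee to the share pool,
// the remaining 90% to the seller.
function splitProceeds(gross: number, feeRate = 0.1): { fee: number; sellerNet: number } {
  const fee = gross * feeRate;
  return { fee, sellerNet: gross - fee };
}
```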

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 21:33:40 -08:00
hailin 192e2551bf feat(trading): real-time price WebSocket push for the asset page
## Backend changes
- Add @nestjs/websockets, @nestjs/platform-socket.io, socket.io dependencies
- New PriceGateway (price.gateway.ts): WebSocket gateway on namespace /price
- New PriceBroadcastScheduler: broadcasts price updates to all connected clients every second
- Register the new modules in ApiModule and ApplicationModule

## Kong API Gateway
- Add WebSocket route: /ws/price -> trading-service:3022/price
- Supports ws/wss protocols

## Frontend changes
- Add the socket_io_client dependency
- New PriceWebSocketService: WebSocket service with automatic reconnection
  - Exponential backoff reconnect strategy (1s -> 30s)
  - Maximum of 10 reconnect attempts
  - Connection state stream
- Asset page WebSocket integration:
  - Connect in initState, disconnect in dispose
  - Real-time price and burn multiplier updates
  - Keeps the existing per-second share growth calculation

## Debug logging
- Detailed debug logs added on both ends for troubleshooting
- Log prefixes: [PriceWS], [AssetPage], [PriceGateway], [PriceBroadcastScheduler]
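The backoff schedule can be sketched as follows (illustrative; only the 1s-30s bounds and the 10-attempt cap come from the commit):

```typescript
// Exponential backoff: 1s, 2s, 4s, ... capped at 30s.
function reconnectDelayMs(attempt: number, baseMs = 1000, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Give up after maxAttempts consecutive failures.
function shouldReconnect(attempt: number, maxAttempts = 10): boolean {
  return attempt < maxAttempts;
}
```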

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 21:18:34 -08:00
hailin f6458dd12e fix(trading): change the market-maker taker interval from 1-4s to a fixed 1s
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 21:00:21 -08:00
hailin 533ad3ba82 feat(mining-app): make the exchange page price read-only, using the live price
- The price input is now read-only; users cannot edit it
- Always use the live price returned by the backend
- The price updates automatically with the real-time refresh

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 20:56:59 -08:00
hailin cfa3979a97 revert: restore multiplier display and calculation on the asset page
- Displayed asset value restored to share balance × (1 + burnMultiplier) × price
- Show effective shares (with multiplier) instead of the raw share count
- Real-time calculation also applies the multiplier factor

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 20:46:57 -08:00
hailin 07247fe05f fix: lower the minimum transfer from 5 to 0.01 shares
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 19:29:05 -08:00
hailin dcf413fb72 fix(mining): perSecondEarning returns non-zero only while mining is active
- GetMiningAccountQuery checks config.isActive
- perSecondEarning returns 0 when mining is inactive
- The frontend asset page timer therefore stops growing

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 08:32:54 -08:00
hailin b7c8cdd249 fix(trading): run burn and snapshot only while the trading system is active
- BurnScheduler checks trading_configs.isActive
- Skip the per-minute burn and price snapshot when the trading system is inactive
- Skip the hourly status log when the trading system is inactive

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 08:28:48 -08:00
hailin 096d87e2a8 fix(trading): distinguish the buyer's payment from the seller's receipt
Problem: executeBuy used tradeAmount, which includes the burn multiplier, but the buyer had only frozen the raw amount
Cause: buyer pays = raw quantity × price; seller receives = effective quantity × price (burn included)
Fix:
- buyerPayAmount = tradeQuantity × price (what the buyer actually pays)
- sellerReceiveAmount = effectiveQuantity × price (what the seller actually receives)
- executeBuy uses buyerPayAmount
- executeSell uses sellerReceiveAmount
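The two amounts can be sketched as (illustrative, mirroring the names in the commit):

```typescript
// The buyer pays for the raw quantity; the seller is credited for the
// effective quantity, which carries the burn multiplier.
function buyerPayAmount(tradeQuantity: number, price: number): number {
  return tradeQuantity * price;
}

function sellerReceiveAmount(tradeQuantity: number, burnMultiplier: number, price: number): number {
  const effectiveQuantity = tradeQuantity * (1 + burnMultiplier);
  return effectiveQuantity * price;
}
```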

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 08:08:58 -08:00
hailin 64b9dcb6c7 fix(trading): fix null buyOrderId during order matching
Problem: the order object passed to tryMatch from createOrder had no id
Cause: orderRepository.save() returned the orderId but never set it on the order object
Fix: re-fetch the order from the database after saving so it has an id before matching

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 07:44:44 -08:00
hailin 2154d5752f fix: fix K-line NaN error and add mining-service transfer endpoints
- Fix the NaN Offset error in the K-line chart when the price range is 0
- Add transfer-out and transfer-in endpoints to mining-service
- Transfers are recorded in the mining_transactions table

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 07:12:32 -08:00
hailin 4e181354f4 feat(frontend): add share transfer feature
- Add a transfer entry link on the trading page
- Implement a two-way transfer dialog (into the trading account / out to the mining account)
- New transfer history page
- Add the transfer-related API calls

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 06:57:23 -08:00
hailin 1760f9b82c fix(trading): add class-validator decorators to all DTOs
Fix DTO validation in trading.controller, admin.controller, and
transfer.controller, resolving the 400 error when placing orders from the app.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 06:14:06 -08:00
hailin dd011c13d4 fix(mining-admin-web): unwrap Kong gateway response data
Handle the Kong gateway's response wrapper uniformly in the axios
interceptor, fixing the market-maker config page not refreshing after initialization.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 05:55:06 -08:00
hailin edd6ced2a3 fix(trading): add class-validator decorators to market-maker DTOs
Fix market-maker initialization failure. With ValidationPipe configured
with forbidNonWhitelisted: true, properties without decorators are rejected.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 05:47:03 -08:00
hailin 4bb5a2b09d refactor(frontend): show actual shares and price on the asset page, dropping the multiplier
- Displayed asset value changed to share balance × price (no multiplier)
- Show the actual share count rather than multiplier-adjusted effective shares
- Simplify the real-time calculation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 04:28:30 -08:00
hailin 8319fe5e9a fix(mining-admin): handle the missing minuteDistribution field in MiningConfigUpdated events
Events published by mining-service only carry secondDistribution, so the
CDC sync computes minuteDistribution = secondDistribution * 60

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 04:08:04 -08:00
hailin 7bc911d4d7 feat(mining): implement manual mining back-pay
Adds manual back-pay of historical mining rewards for users synced from the 1.0 system:

- mining-service: add a ManualMiningRecord table plus back-pay calculation/execution logic
- mining-wallet-service: add a MANUAL_MINING_REWARD transaction type and Kafka consumer
- mining-admin-service: add the back-pay API controller and proxy service
- mining-admin-web: add the manual back-pay page and a sidebar menu item

Features:
- Back-pay amount computed from the user's contribution power and the current mining config
- Each user can run the back-pay operation only once
- Cross-service data consistency via Kafka events
- Full operation records and wallet sync status tracking

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 03:50:03 -08:00
hailin 4a4393f995 fix(frontend): asset page real-time growth was missing the burnMultiplier factor
The per-second asset growth formula should be: per-second share growth × (1 + burnMultiplier) × price
The (1 + burnMultiplier) factor was missing, underestimating growth by roughly 5000x

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-18 03:42:21 -08:00
hailin 5a719eef61 fix(trading): merge the burn and snapshot tasks so K-line prices are correct
Merge executeMinuteBurn and createPriceSnapshot into a single cron task
so the snapshot is created after the burn completes, avoiding flat-price
gaps in the K-line chart

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 23:26:04 -08:00
hailin b826511f3c chore(frontend): remove asset page debug logs
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 23:16:15 -08:00
hailin 4eb466230e fix(mining): fix the ShareAmount call pattern
Use .value.toNumber() instead of .toNumber()
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 22:59:44 -08:00
hailin 4f1f1f9eaf feat(mining): add perSecondEarning to the mining account API
- Backend: per-second earning = (user contribution / network contribution) × per-second distribution
- Frontend: fix the field mapping to match the backend response format
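A minimal sketch of that formula (illustrative names):

```typescript
// Per-second earning: the user's share of the network contribution,
// applied to the per-second distribution amount.
function perSecondEarning(
  userContribution: number,
  networkContribution: number,
  secondDistribution: number,
): number {
  if (networkContribution <= 0) return 0;
  return (userContribution / networkContribution) * secondDistribution;
}
```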

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 22:56:19 -08:00
hailin 33233901a9 debug(frontend): add asset page timer debug logs
Add debugPrint logs to help track down the 1-second refresh not taking effect

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 22:49:46 -08:00
hailin d8dd38e91b fix(frontend): fix the asset page's 1-second live refresh
- Use mining-service's perSecondEarning instead of assetGrowthPerSecond
- Add a _timerStarted flag to prevent the timer from starting twice
- Fix the timer possibly not starting on page entry
- Refresh shareAccountProvider together with pull-to-refresh
- Display the per-second growth value correctly

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 22:42:23 -08:00
hailin 5c633b9979 perf(frontend): tune the trading page data refresh strategy
- Price refresh: 5 minutes → 15 seconds
- Order refresh: 2 minutes → 10 seconds
- Market overview: 5 minutes → 30 seconds
- Trading account: 2 minutes → 15 seconds
- After a successful trade: refresh immediately, again after 2 seconds, and finally after 5 seconds
- Ensures users quickly see their fills after the market maker takes the order

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 22:14:52 -08:00
hailin b1fedd417f fix(trading): fix migration execution order
Rename 0003_add_market_maker_depth to 0006_add_market_maker_depth so the
migration adding depth fields runs after 0005_add_market_maker_and_order_source
has created the market_maker_configs table.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 21:59:06 -08:00
hailin 3265ee2506 feat(trading): raise initial green points from 5,760 to 5.76 billion
## Summary

Raise the share pool's initial green points from 5,760 to 5,760,000,000 (5.76 billion)

### Price impact

- Initial price: 5,760 / 10.002 billion ≈ 5.76×10⁻⁷ → 5.76 billion / 10.002 billion ≈ 0.576
- The price moves from a vanishingly small value to a normal trading value
- Better matches real trading conventions

### Technical assessment

- Database precision Decimal(30,8) is more than enough
- The price formula scales linearly; the logic is unchanged
- The burn multiplier formula does not involve greenPoints, so it is unaffected
- The "price rises as people sell" mechanism is unchanged

### Files changed

- prisma/seed.ts: seed data initialization
- burn.service.ts: runtime initialization logic
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 21:50:43 -08:00
hailin 8c78f26e6d feat(trading): make market-maker taker/maker modes mutually exclusive
## Backend - trading-service

### MarketMakerService
- New MarketMakerMode type: 'idle' | 'taker' | 'maker'
- New getCurrentMode() and getRunningStatus() methods for the current running state
- start() (taker mode): automatically stops maker mode before starting
- startMaker() (maker mode): automatically stops taker mode before starting
- The two modes are mutually exclusive; only one runs at a time

### MarketMakerController
- getConfig now returns runningStatus
- New GET /status endpoint for the market maker's running state

## Frontend - mining-admin-web

### Market maker admin page
- New running-mode status card showing the current mode (idle/taker/maker)
- Taker and maker modes use runningStatus to determine state
- Mutual-exclusion hint: starting one mode automatically stops the other
- Maker-mode warning: filled sell orders trigger burns and push the price up

### API updates
- New RunningStatus interface type
- getConfig return type gains runningStatus
- New getRunningStatus API

## Design notes
- Taker mode (recommended): the market maker only buys and triggers no extra burns
- Maker mode (use with caution): market-maker sell orders trigger the burn mechanism

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 21:41:39 -08:00
hailin 3b6bd29283 feat(trading): implement a full CEX-style two-sided market-maker depth system
## Backend - trading-service

### Database model extensions (Prisma Schema)
- TradingConfig: new depthEnabled field toggling depth display
- MarketMakerConfig: new two-sided quoting configuration
  - makerEnabled: maker-mode switch
  - bidEnabled/askEnabled: independent bid/ask switches
  - bidLevels/askLevels: number of bid/ask levels
  - bidSpread/askSpread: bid/ask spread ratios
  - bidLevelSpacing/askLevelSpacing: spacing between levels
  - bidQuantityPerLevel/askQuantityPerLevel: quantity per level
  - refreshIntervalMs: refresh interval
- MarketMakerOrder: new model tracking market-maker orders
- MarketMakerLedger: new model for market-maker account flows

### Market maker service (MarketMakerService)
- depositShares/withdrawShares: share deposits and withdrawals
- startMaker/stopMaker: maker-mode start/stop
- refreshMakerOrders: core two-sided quoting logic
  - Computes per-level bid/ask prices and quantities from the current price
  - Automatically cancels old orders and creates new ones
  - Records the market-maker order associations
- cancelAllMakerOrders: cancel all market-maker orders
- getDepth: fetch order-book depth data
- updateMakerConfig/getMakerOrders: config and order queries

### API endpoints
- MarketMakerController:
  - POST /deposit-shares: deposit shares
  - POST /withdraw-shares: withdraw shares
  - POST /start-maker: start maker mode
  - POST /stop-maker: stop maker mode
  - POST /refresh-orders: manually refresh orders
  - POST /cancel-all-orders: cancel all orders
  - PUT /maker-config: update maker config
  - GET /maker-orders: query market-maker orders
  - GET /depth: fetch depth data
- AdminController:
  - GET/POST /trading/depth-enabled: depth display switch
- PriceController:
  - GET /depth: public depth endpoint (gated by depthEnabled)

### Domain layer extensions
- TradingAccountAggregate: new depositShares/withdrawShares methods
- OrderAggregate: supports a source field identifying order origin

## Frontend - mining-admin-web

### Market maker admin page (/market-maker)
- Account balances: point and share balances
- Fund management: deposit/withdraw dialogs for points and shares
- Taker mode: start/stop/manual-take controls
- Maker mode: start/stop/refresh orders/cancel all
- Depth switch: controls whether the public API returns depth data
- Depth view: live table of bid/ask depth data

### Frontend architecture
- market-maker.api.ts: full API client
- use-market-maker.ts: React Query hook wrappers
- sidebar.tsx: new "Market Maker" navigation menu item

## Database migrations
- 0003_add_market_maker_depth: two-sided depth fields
- 0005_add_market_maker_and_order_source: order source tracking
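The per-level quoting in refreshMakerOrders can be sketched as follows (illustrative; the real service reads the config fields listed above):

```typescript
interface QuoteLevel { side: "bid" | "ask"; price: number; quantity: number }

// Build quote levels on one side of the book: level 1 sits `spread` away
// from the current price, and each further level steps out by `levelSpacing`.
function buildLevels(
  price: number,
  levels: number,
  spread: number,        // e.g. 0.01 = 1% away for level 1
  levelSpacing: number,  // e.g. 0.005 = 0.5% between levels
  quantityPerLevel: number,
  side: "bid" | "ask",
): QuoteLevel[] {
  const sign = side === "bid" ? -1 : 1;
  return Array.from({ length: levels }, (_, i) => ({
    side,
    price: price * (1 + sign * (spread + i * levelSpacing)),
    quantity: quantityPerLevel,
  }));
}
```

On each refresh tick the service would cancel the previous orders and re-run this around the latest price, which is why the independent bid/ask switches and per-side parameters exist.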

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 21:11:23 -08:00
hailin 416495a398 fix(mining): correctly parse network-progress API response
The API returns {success: true, data: {...}} but code was accessing
progressResult.currentContributionPerTree directly instead of
progressResult.data.currentContributionPerTree.

This caused:
- totalTreeCount to be 0 (undefined → 0)
- networkTotalContribution to be 0
- No mining distributions happening

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 09:46:25 -08:00
hailin 11ff3cc9bd fix: correct totalShares and distributionPool values
- totalShares: 100020000000 → 10002000000 (10.002 billion = 10 billion + 2 million)
- distributionPool: 200000000 → 2000000 (2 million)

Fixed in:
- trading-service/prisma/schema.prisma
- trading-service/prisma/migrations/0002_add_trading_burn_system/migration.sql
- mining-service/.env.example

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 09:01:17 -08:00
hailin 481a355d72 feat(trading): add buy function control switch with admin management
- Add buyEnabled field to TradingConfig in trading-service with migration
- Add API endpoints for get/set buy enabled status in admin controller
- Add buy function switch card in mining-admin-web trading page
- Implement buyEnabledProvider in mining-app with 2-minute cache
- Show "待开启" when buy function is disabled in trading page
- Add real-time asset value refresh in asset page (1-second updates)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 08:56:35 -08:00
hailin e8f3c34723 fix(contribution): show the user's actual effective contribution as the adoption-record total
Backend:
- get-planting-ledger.query.ts: add an effectiveContribution field
- Read the user's actual personal contribution (personalContribution) from contributionAccount

Frontend:
- planting_record.dart: add effectiveContribution to PlantingSummary
- planting_record_model.dart: parse the effectiveContribution field
- planting_records_page.dart: show effectiveContribution instead of totalAmount as the total contribution

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 08:29:14 -08:00
hailin 613fb33ff9 refactor(frontend): remove the burn-ratio label from the exchange page's sell flow
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 08:24:26 -08:00
hailin 6043d2fec8 fix(mining): calculate remainingDistribution from actual distributed amount
- Changed from reading config.remainingDistribution to calculating:
  remainingDistribution = distributionPool - totalDistributed
- Ensures data consistency: remaining + distributed = total pool
- Added Math.max(0, ...) to prevent negative values
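As a sketch of the calculation described above (illustrative names):

```typescript
// Remaining distribution derived from the actual distributed amount,
// clamped so it never goes negative.
function remainingDistribution(distributionPool: number, totalDistributed: number): number {
  return Math.max(0, distributionPool - totalDistributed);
}
```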

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 08:19:05 -08:00
hailin 3e536115eb fix(mining): add defensive checks for network sync undefined values
- Handle missing currentContributionPerTree with default value
- Add null checks for all network progress fields
- Prevent DecimalError when contribution service returns incomplete data

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 01:19:18 -08:00
hailin 68a583508b fix(mining): correct progress calculation to use totalDistributed/distributionPool
Previously used (pool - remaining) / pool which was incorrect.
Now uses actual distributed amount / total pool for accurate percentage.
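The corrected percentage, as a sketch (illustrative names):

```typescript
// Progress as the fraction of the pool actually distributed.
function miningProgress(totalDistributed: number, distributionPool: number): number {
  if (distributionPool <= 0) return 0;
  return totalDistributed / distributionPool;
}
```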

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 01:10:54 -08:00
hailin d5f3f3b868 feat(frontend): implement the 4 "other settings" items on the Me page
- Notifications: toggle switch, state persisted to SharedPreferences
- Dark mode: toggle switch, state persisted to SharedPreferences
- Help center: new page with an FAQ and contact info
- About us: new page with the app intro, feature highlights, contact info, and legal terms

New files:
- settings_providers.dart: settings state management
- help_center_page.dart: help center page
- about_page.dart: about page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 01:08:21 -08:00
hailin 1e33ab178d fix(mining): move progress endpoint to MiningController for correct Kong routing
- Add /api/v2/mining/progress endpoint in MiningController
- Update frontend API to call /progress instead of /admin/mining/status
- Kong routes /api/v2/mining/* with strip_path=false, so endpoint must
  be under /mining controller path

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:55:58 -08:00
hailin 1aaf32cbb3 refactor(frontend): change "total amount" to "total contribution" in the adoption-record summary
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:33:57 -08:00
hailin d424f2a18e refactor: rename '算力占比' to '贡献值占比' in mining records
- Update label in Flutter mining records page
- Update table header in admin web mining records list
- Update memo strings in mining-wallet-service

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:33:02 -08:00
hailin 49949ff979 fix(mining): use unified transaction to prevent timeout errors
- Wrap all database operations in executeSecondDistribution with
  UnitOfWork.executeInTransaction
- Pass transaction client to repository save methods
- Use longer transaction timeout (60s) for batch operations
- Move Redis operations outside transaction (non-ACID)
- Add distributeToSystemAndPendingInTx method that accepts tx client

This resolves the "Unable to start a transaction in the given time"
error caused by multiple concurrent transactions competing for
database connections.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:31:20 -08:00
hailin 725fb80f80 refactor(frontend): remove the payment password feature from the Me page
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:28:30 -08:00
hailin 76d6c30a20 refactor(frontend): rename send/receive "shares" to "points"
- send_shares_page.dart: title changed to "发送积分值", hint text updated to match
- receive_shares_page.dart: title changed to "接收积分值"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:26:44 -08:00
hailin 216394a44f refactor(trading): rename sell-flow "黑洞" to "积分股池"
- Update "10% 进入黑洞" to "10% 进入积分股池"
- Update "销毁金额" to "进入积分股池" in the sell confirmation
- Update the "注意" text in the sell confirmation
- Change the "积分股池" text color from red to green
2026-01-17 00:15:38 -08:00
hailin aee64d9be8 fix(mining): add null safety to MiningConfigUpdated event payload
Prevent TypeError when config properties are undefined by using
optional chaining and default values in publishMiningConfigUpdated.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:10:28 -08:00
hailin 22702e898b fix(mining-admin): dashboard showed pending-unlock contribution as 0
- mining-admin-service: new fetchContributionServiceStats() method that
  pulls the complete pending data from the contribution-service API
- mining-admin-service: refactor getDetailedContributionStats() to prefer
  the API data and fall back to local data on failure
- mining-service: fix an error caused by publishMiningConfigUpdated using
  the deprecated minuteDistribution field

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:09:40 -08:00
hailin e80e672ffe feat(mining-admin): add mining progress dashboard component
Add real-time mining progress statistics similar to burn progress:
- Backend: new /admin/mining/status endpoint in mining-service
- Frontend: MiningProgress component with progress bar and stats
- Shows: total distributed, remaining pool, minutes left, per-minute rate
- Auto-refresh every 60 seconds via React Query

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-17 00:00:20 -08:00
hailin ea1e376939 chore(app): rename app to 股行
- Update Android app label in AndroidManifest.xml
- Update iOS CFBundleDisplayName and CFBundleName in Info.plist

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 23:50:23 -08:00
hailin 9deffe2565 fix(mining): resolve transaction timeout by using single transaction for system accounts
Problem:
- Multiple concurrent transactions updating system_mining_accounts caused row lock contention
- 16+ transactions waiting for tuple/transactionid locks led to timeout errors
- This prevented writeMinuteRecords() from executing, leaving mining_records empty

Solution:
- Modified SystemMiningAccountRepository.mine() to accept optional external transaction client
- Created new distributeToSystemAndPending() method that processes all system accounts
  and pending contributions in a single transaction
- Pre-calculate all rewards before transaction, then execute updates sequentially
- Aggregate all pending contribution rewards into single HEADQUARTERS update
- Move Redis accumulation outside transaction to avoid blocking

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 18:53:53 -08:00
hailin d5e5bf642c fix(kline-chart): prevent overflow in indicator selector and legend
- Wrap indicator selector Row in SingleChildScrollView for horizontal scrolling
- Add maxX boundary checks in _drawLegend to stop drawing when exceeding available space
- Prevents text overflow on narrow screens or when displaying many indicators

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:55:49 -08:00
hailin 27bf67e561 fix(kline-chart): improve pinch-to-zoom and fullscreen display
- Refactor to pixel-based scrolling system for smoother interaction
- Fix pinch-to-zoom to properly scale around focal point
- Adjust fullscreen layout to give more space to main chart (65%)
- Add candleWidth parameter to all painters for consistent rendering
- Detect multi-touch gestures using pointerCount

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:49:14 -08:00
hailin 0ebb0ad076 fix(contribution): use Symbol token for SYNCED_DATA_REPOSITORY injection
The GetTeamTreeQuery was importing SYNCED_DATA_REPOSITORY as a Symbol from
the domain interface, but InfrastructureModule defined its own string token.
This caused NestJS dependency resolution to fail.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:33:21 -08:00
hailin c84341be37 fix(mining): return totalDistributed (sum of totalMined) in admin status
The dashboard was incorrectly calculating distributed shares using
distributionPool - remainingDistribution. The correct value is the sum
of all users' totalMined balances. Updated mining-service to return
totalDistributed directly, and mining-admin-service to use it.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:27:15 -08:00
hailin b645621c81 fix(admin): add SystemAccountSynced event handler for system contribution sync
The mining-admin-service was only listening for SystemContributionUpdated
events, but contribution-service publishes SystemAccountSynced events.
Added the missing handler to properly sync system account contribution data.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:22:59 -08:00
hailin 1f0bd15946 feat(mining-app): add trading records page and remove withdrawal records
- Add TradingRecordsPage to display trade order history with status
- Connect trading records to profile page "交易记录" button
- Remove unused "提现记录" button from profile page
- Add route and navigation for trading records

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:22:15 -08:00
hailin 4ec6c9f48b feat(contribution/mining-app): add team tree API using contribution-service 2.0
Add team info and direct referrals endpoints to contribution-service,
using SyncedReferral data synced via CDC. Update mining-app to use the
new v2 contribution API instead of legacy referral-service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:17:18 -08:00
hailin 3d6b6ae405 feat(mining-app): connect my team page from profile
Link the "我的团队" menu item in profile page to the team tree page,
completing the integration of the team tree feature.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 09:03:06 -08:00
hailin 64ccb8162a fix(admin): correct distributed shares calculation to use 2M pool
The dashboard was incorrectly defaulting the distribution pool to 5 billion
when calculating already-distributed shares. The actual mining distribution
pool is 2 million shares.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 08:59:46 -08:00
hailin 20a90fce4c feat(mining-app): add professional kline chart with technical indicators
- Add KlineChartWidget with pinch-to-zoom, fullscreen mode
- Implement MA, EMA, BOLL indicators for main chart
- Implement MACD, KDJ, RSI indicators for sub chart
- Add volume display with crossline info
- Add C2C trading feature with market/publish/detail pages
- Add P2P transfer functionality (send/receive shares)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 08:51:00 -08:00
hailin 3ce8bb0044 fix(mining-admin): parse burn records response correctly
The trading-service wraps all responses with { success, data, timestamp }.
Need to extract data.data for burn records endpoint.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 08:41:16 -08:00
hailin 7852b9d673 feat(mining): publish CDC events for mining-admin-service sync
Add event publishing to enable mining-admin-service to sync data via
Debezium CDC instead of direct API calls:

- MiningConfigUpdated: Published every minute with distribution status
- DailyMiningStatCreated: Published when daily stats are generated
- MiningAccountUpdated: Method added for future per-account sync

These events will be captured by Debezium monitoring the outbox_events
table and forwarded to mining-admin-service via Kafka.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 08:33:02 -08:00
hailin 9d65eef1b1 fix(mining-admin): fetch dashboard data from remote services
Dashboard now fetches totalDistributed and totalBurned directly from
mining-service and trading-service APIs instead of relying solely on
CDC sync which may not have data.

- Add fetchRemoteServiceData() to get real-time data
- Use mining-service /admin/status for totalDistributed
- Use trading-service /asset/market for totalBurned and circulationPool
- Add 30-second cache to reduce API calls

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 08:25:07 -08:00
hailin 3096297198 feat(mining-app): asset page polish and profile editing
- Remove the "withdraw" button on the asset page; change "transfer" to "C2C"
- Remove the "withdrawable" tag on the points card
- Simplify the asset and exchange page title bars by removing the side icons
- Match the asset page background color to the exchange page
- New profile editing page with avatar color selection and nickname editing
- Avatar and nickname stored locally
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 07:58:16 -08:00
hailin 854bb7a0ac fix(mining-admin): correct ContributionSyncStatus property names
Property names should match what's used in the UI component:
- miningNetworkTotal (was miningTotal)
- networkTotalContribution (was contributionTotal)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 07:41:59 -08:00
hailin 2534068f70 fix(mining): remove duplicate burn mechanism from mining-service
Mining-service incorrectly implemented its own burn mechanism (10-year
cycle) which was not in the requirements. Per requirements, only
trading-service should handle per-minute burn (4756.47/minute).

Removed:
- BlackHoleRepository and all burn-related methods
- executeBurn() from mining distribution service
- Burn stats from admin API and queries
- Burn progress UI from mining admin web

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 07:31:13 -08:00
hailin f22c3efb11 fix: use correct property name 'type' for unallocated contribution
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 05:40:40 -08:00
hailin 0241930011 feat(contribution/mining): sync unallocated contributions to mining-service
- Add UnallocatedContributionSyncedEvent in contribution-service
- Add event handler in mining-service's contribution-event.handler.ts
- Add handleUnallocatedContributionSynced in network-sync.service.ts
- Add admin endpoint to publish all unallocated contributions
- Sync pending/unallocated contributions to PendingContributionMining table

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 05:29:28 -08:00
hailin 130bf57842 fix(contribution): publish system account sync event when processing adoptions
- After saving system account contribution, publish SystemAccountSyncedEvent
- Lets mining-service sync contribution for the operation/province/city companies

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 05:24:04 -08:00
hailin 962e7874c8 fix(contribution): fix network theoretical contribution not being synced
- Call updateNetworkProgress in calculateForAdoption to update the NetworkAdoptionProgress table
- Previously publishNetworkProgressEvent always read totalTreeCount as 0 because the table was never updated

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 04:55:30 -08:00
hailin bb75ff19a4 feat(contribution): publish network progress event automatically after adoption processing
- Publish NetworkProgressUpdatedEvent after each adoption allocation completes
- mining-service receives network theoretical contribution updates in real time via Kafka
- Scheduled sync reduced to every 5 minutes as a fallback

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 04:46:12 -08:00
hailin 23bb8baa9c feat(mining): automatically sync network theoretical contribution
- Sync network data from contribution-service automatically on startup
- Scheduled per-minute sync of network theoretical contribution and system account contribution
- Use a distributed lock to prevent concurrent syncs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 04:44:27 -08:00
hailin 7909bcc3d1 fix(mining-admin): fix sync status detection logic
- Sync check now only verifies whether network theoretical contribution is in sync
- Network theoretical contribution is the mining denominator, the core indicator of sync completion
- Use relative error (0.1%) instead of absolute error to decide sync status

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 04:07:18 -08:00
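The relative-error check described above can be sketched as a small predicate; the 0.1% threshold is from the commit, the function name and sample values are illustrative:

```typescript
// Sketch of the sync check: two services' network theoretical contribution
// values count as "in sync" when their relative difference is within relTol.
function isSynced(a: number, b: number, relTol = 0.001): boolean {
  const denom = Math.max(Math.abs(a), Math.abs(b));
  if (denom === 0) return true; // both zero: trivially in sync
  return Math.abs(a - b) / denom <= relTol;
}
```

Using relative rather than absolute error keeps the check meaningful as the network total grows by orders of magnitude.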
hailin 9e15fa4fd8 fix(admin-web): fix network contribution sync status display
- Show network theoretical contribution instead of user effective contribution
- Display a comparison of network theoretical contribution between mining-service and contribution-service

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 04:02:29 -08:00
hailin de5416aee6 feat(mining): let system accounts and pending-unlock contribution participate in mining
Breaking changes:
- Mining denominator changed from user effective contribution to network theoretical contribution (networkTotalContribution)
- System accounts (operation 12% / province 1% / city 2%) participate in mining with their own mining records
- Pending-unlock contribution participates in mining; earnings go to the headquarters account, and records include full source information

New features:
- mining-service: system mining account table, pending-unlock contribution table, and related mining record tables
- mining-service: NetworkSyncService to sync network data
- mining-service: /admin/sync-network and /admin/system-accounts endpoints
- contribution-service: /admin/system-accounts and an endpoint to publish system account events
- mining-admin-service: status check returns network theoretical contribution info

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 03:39:56 -08:00
hailin b5fca7bb04 fix(mining-admin): fix API paths for the contribution sync status check
- contribution-service: add the @Public() decorator to the /contribution/stats endpoint
- mining-admin-service: correct the API path from api/v1 to api/v2

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 03:09:05 -08:00
hailin 7c00c900a0 feat(mining-admin): disable the mining activation button until contribution sync completes
- Backend: getMiningStatus fetches the contribution-service total contribution in parallel and compares both sides for consistency
- Frontend: while out of sync, show the "全网算力同步中..." hint and disable the activate button
- Frontend: refresh status every 3 seconds during sync, reverting to every 30 seconds once synced

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 03:00:40 -08:00
hailin 72b3b44d37 feat(trading): implement real-data candlestick chart with adaptive Y axis
Backend (trading-service):
- Add GET /api/v2/price/klines API endpoint
- Support multi-interval K-line aggregation (1m/5m/15m/30m/1h/4h/1d)
- Aggregate PriceSnapshot data into OHLC format

Frontend (mining-app):
- Add klinesProvider to fetch K-line data
- Rewrite _CandlestickPainter to use real data
- Implement adaptive Y-axis scaling to magnify price changes
- Interval selector triggers data refresh

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 23:57:12 -08:00
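Aggregating price snapshots into OHLC candles, as the /price/klines endpoint above does, can be sketched like this; the `Snapshot`/`Candle` field names are assumptions, not the service's actual schema:

```typescript
// Illustrative OHLC aggregation: snapshots are grouped into fixed-width
// time buckets; open = first price, close = last, high/low = extremes.
interface Snapshot { timestamp: number; price: number } // ms since epoch
interface Candle { start: number; open: number; high: number; low: number; close: number }

function aggregate(snapshots: Snapshot[], intervalMs: number): Candle[] {
  const buckets = new Map<number, Candle>();
  // Sort first so "first" and "last" within a bucket are chronological.
  for (const s of [...snapshots].sort((a, b) => a.timestamp - b.timestamp)) {
    const start = Math.floor(s.timestamp / intervalMs) * intervalMs;
    const c = buckets.get(start);
    if (!c) {
      buckets.set(start, { start, open: s.price, high: s.price, low: s.price, close: s.price });
    } else {
      c.high = Math.max(c.high, s.price);
      c.low = Math.min(c.low, s.price);
      c.close = s.price; // snapshots are sorted, so the last one wins
    }
  }
  return [...buckets.values()].sort((a, b) => a.start - b.start);
}
```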
hailin 8ab11c8f50 feat(wallet): sync burn events from trading-service to deduct SHARE_POOL_A
Add Kafka consumer to listen for burn events (minute burn and sell burn)
from trading-service and deduct from SHARE_POOL_A (100B pool), updating
BLACK_HOLE_POOL balance accordingly.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 23:24:43 -08:00
hailin 88368d1705 fix(wallet): query wallets consistently by accountSequence to fix insufficient-balance transfer failures
Background: ghost wallet D26010800000 (user_id=133, balance=0) caused transfers
by real user D26010900000 (user_id=0, balance=200465) to fail

Cause:
- D26010800000 is dirty data created at 2026-01-08 16:23 through an unknown path
- When the real user D26010900000 registered at 18:40, user_id=133 was already taken
- getMyWallet queried by accountSequence and showed the correct balance
- requestWithdrawal queried by userId and found the wrong, empty wallet

Fix:
- Controller: pass user.accountSequence instead of user.userId
- Service: remove the findByUserId fallback; use only findByAccountSequence
- Read userId from the wallet record for order, ledger, and event associations

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 23:00:40 -08:00
hailin 974d660544 feat(mining): sync pool balance via Kafka when mining distributes
- mining-service: publish MINING_MINUTE_DISTRIBUTED event to Kafka after
  each minute's mining distribution is completed
- mining-wallet-service: add MiningDistributionConsumer to consume the
  event and deduct from SHARE_POOL_B
- Add deductFromSharePoolB method in PoolAccountService
- This ensures the share pool balance displayed in mining-app reflects
  the actual remaining balance after mining distributions

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 21:30:35 -08:00
hailin 7b3c222b24 fix(admin-web): use dedicated trading client with correct base URL
Trading API was incorrectly routed through mining-admin baseURL in production,
causing 404 errors. Created independent tradingClient with /api/trading baseURL
to properly route requests through Next.js rewrites to Kong -> trading-service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 21:12:33 -08:00
hailin 52a5ae64c0 Revert "fix(admin-web): add API_GATEWAY_URL env var for Docker build"
This reverts commit 1d7f05b12d.
2026-01-15 21:03:57 -08:00
hailin 1d7f05b12d fix(admin-web): add API_GATEWAY_URL env var for Docker build
The Next.js rewrites in next.config.js require API_GATEWAY_URL to be set
at build time. Added this environment variable to both Dockerfile and
docker-compose.yml to ensure proper routing to Kong gateway in production.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 21:01:27 -08:00
hailin 967e6c1f44 fix(admin-web): fix API response data extraction for trading endpoints
- Add proper extraction of nested data from { success, data, timestamp } response format

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 20:54:26 -08:00
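The envelope unwrapping these admin-web fixes keep returning to can be sketched as one helper; the `{ success, data, timestamp }` shape is from the commits, the function itself is illustrative:

```typescript
// Sketch: extract the payload from a TransformInterceptor-style envelope,
// passing bare payloads through untouched.
function unwrap<T>(body: unknown): T {
  if (
    body !== null &&
    typeof body === "object" &&
    "success" in (body as object) &&
    "data" in (body as object)
  ) {
    return (body as { data: T }).data; // wrapped: return the inner payload
  }
  return body as T; // already a bare payload
}
```

With axios, for example, the wrapped payload sits at `res.data.data`, so a helper like this centralizes the `.data.data` access the commit describes.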
hailin 2da02e0823 fix(admin-web): fix trading-service proxy routing for Kong gateway
- Add production/development environment detection
- Production: route through Kong gateway at /api/v2/trading/*
- Development: direct connection to trading-service at localhost:3022

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 20:48:31 -08:00
hailin 8018fa5110 feat(admin): add trading system management UI and API
- Add trading system activate/deactivate endpoints to trading-service
- Add trading management page to mining-admin-web with:
  - Trading system status display and control
  - Market overview (price, green points, circulation pool)
  - Burn progress visualization
  - Burn records list with filtering
- Add trading-service proxy configuration to next.config.js
- Add trading menu item to sidebar navigation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 20:37:52 -08:00
hailin 1e2d8d1df7 feat(asset): aggregate mining and trading account balances in asset display
- Modify AssetService to fetch mining account balance from mining-service
- Sum mining balance + trading balance for total share display
- Add miningShareBalance and tradingShareBalance fields to AssetDisplay
- Update frontend entity and model to support new fields

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 19:53:31 -08:00
hailin ed715111ae fix(trading): auto-initialize SharePool and CirculationPool on startup
- Add SharePool and CirculationPool initialization in BurnService.initialize()
- Initialize SharePool with 5760 green points (fixes price showing as 0)
- Remove misleading "= 5,760 积分值" display from trading page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 19:32:46 -08:00
hailin e611894b55 fix(trading-service): use payload.sub as accountSequence in JWT guard
auth-service puts accountSequence in payload.sub, not payload.accountSequence.
This mismatch caused 401 errors when accessing trading endpoints.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 17:54:28 -08:00
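The guard fix above boils down to reading the right claim; a minimal sketch, assuming only what the commit states (accountSequence lives in `payload.sub`):

```typescript
// Sketch of mapping a verified JWT payload to the request user object.
interface JwtPayload { sub: string; [claim: string]: unknown }

function toRequestUser(payload: JwtPayload): { accountSequence: string } {
  // auth-service puts accountSequence in the standard `sub` claim;
  // reading a nonexistent payload.accountSequence yielded undefined -> 401.
  return { accountSequence: payload.sub };
}
```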
hailin 83b05ac146 fix(docker): add JWT_SECRET to mining-service and trading-service
Both services were missing JWT_SECRET environment variable, causing
401 Unauthorized errors when validating JWT tokens from auth-service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 17:25:55 -08:00
hailin 01bd638dbb fix(contribution-service): add parent .env path for shared config
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 07:00:02 -08:00
hailin 7a469be7cd fix(mining-*): add parent .env path for shared config
All mining services need to read shared environment variables
(JWT_SECRET, DATABASE_URL, etc.) from backend/services/.env

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:58:56 -08:00
hailin 0420b0acab fix(trading,auth): add parent .env path for shared JWT_SECRET
Both services need to read JWT_SECRET from the shared .env file
in the parent directory (backend/services/.env).

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:57:57 -08:00
hailin 4440f40fba fix(mining-wallet-service): use upsert in seed for 100% overwrite
Remove existence check, directly upsert pool accounts to ensure
consistent state on every seed run.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:42:34 -08:00
hailin fdff3a3119 feat(mining-wallet-service): add migration for SHARE_POOL_A and SHARE_POOL_B
Split the share pool into two accounts:
- SHARE_POOL_A: 10 billion, for burning
- SHARE_POOL_B: 2 million, for mining distribution
Total: 10.002 billion

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:40:56 -08:00
hailin 4b1855f57a fix(mining-app): use public API for asset page to avoid JWT mismatch
Changed from myAssetProvider (requires JWT) to accountAssetProvider
(public API) to avoid 401 errors when trading-service JWT_SECRET
doesn't match auth-service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:34:19 -08:00
hailin 4cef31b1d1 fix(api-gateway): correct mining-admin-service upstream URL to /api/v2
The service uses 'api/v2' as global prefix, not 'api/v1'.
Request flow: /api/v2/mining-admin/auth/login -> strip path -> /auth/login -> upstream /api/v2/auth/login

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:29:10 -08:00
hailin 109986ba49 fix(mining-wallet-service): move share-pool-balance route before :type param route
NestJS routes are matched in order, so the parameterized :type route
was capturing 'share-pool-balance' before it could reach the public
endpoint, causing 401 errors.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 06:22:30 -08:00
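The route-ordering pitfall above (and the identical one fixed later in mining-admin-service) comes from first-match routing; a toy simulation, not NestJS's actual matcher:

```typescript
// First-match route simulation: why a static route like
// 'share-pool-balance' must be declared before a ':type' parameter route.
type Route = { pattern: string; name: string };

function firstMatch(routes: Route[], path: string): string {
  for (const r of routes) {
    // a static pattern must equal the path; a ':param' pattern matches any segment
    if (r.pattern === path || r.pattern.startsWith(":")) return r.name;
  }
  return "none";
}
```

With the parameter route declared first, the static path is swallowed by `:type`; declaring the static route first restores the intended behavior.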
hailin b5899497ea fix(mining-wallet-service): use SHARE_POOL_A instead of SHARE_POOL for mining rewards 2026-01-15 06:05:17 -08:00
hailin 40869ef00f feat: split share pool into A (10 billion) and B (2 million) accounts
Backend changes:
- mining-wallet-service: Split SHARE_POOL into SHARE_POOL_A (10 billion, for burning)
  and SHARE_POOL_B (2 million, for mining distribution)
- Add /pool-accounts/share-pool-balance API endpoint to get total balance
- Update pool initialization logic and seed data
- Fix Kong routing for mining-wallet-service (strip_path: true)
- Fix Kong routing for trading-service (strip_path: true)

Constant updates (10.002 billion = 10,002,000,000):
- mining-service: TOTAL_SHARES
- trading-service: TOTAL_SHARES, trading config defaults
- trading-service seed: initial green points = 5760

Frontend changes:
- Add sharePoolBalanceProvider to fetch pool balance from mining-wallet-service
- Update contribution page to display real-time share pool balance (A + B)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 05:55:52 -08:00
hailin a1508b208e fix(api-gateway): correct Kong routing for trading-service
- Change strip_path to true to strip /api/v2/trading prefix
- Add /api/v2 to upstream URL so requests route correctly
- Revert accidental main.ts change

Request flow: /api/v2/trading/asset/market -> strip /api/v2/trading -> /asset/market -> upstream /api/v2/asset/market

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 05:27:46 -08:00
hailin c60d3b2f26 fix(trading-service): correct global prefix to match Kong routing
Change prefix from 'api/v2' to 'api/v2/trading' to match Kong gateway
configuration with strip_path: false.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 05:26:11 -08:00
hailin bb4143d75b fix(trading-service): exclude prisma from tsconfig to fix build output path 2026-01-15 04:46:01 -08:00
hailin d12bbb17be feat(mining-app): add share pool balance display on contribution page
Display real-time share pool balance (积分股池实时余量) in the total
contribution card on the contribution page.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 04:27:48 -08:00
hailin 19428a8cb7 feat(trading-service): sync trading account creation with wallet service
- Add CDC consumer to listen for UserWalletCreated events from mining-wallet-service
- Create trading accounts when user contribution wallets are created (lazy creation)
- Add WalletSystemAccountCreated handler for province/city system accounts
- Add seed script for core system accounts (HQ, operation, cost, pool)
- Keep auth.user.registered listener for V2 new user registration

This ensures trading accounts are created in sync with wallet accounts,
supporting both V2 new users and V1 migrated users.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 04:27:14 -08:00
hailin 183b2bef59 style(mining-app): hide accountSequence and rename phone to ID on profile page
- Remove accountSequence (ID: xxxx) display from profile page
- Rename "手机:" label to "ID:" for phone number display

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 03:39:18 -08:00
hailin 1bdb9bb336 style(mining-admin-web): display all numbers with 8 decimal places
Update all formatDecimal, formatNumber, formatPercent, formatCompactNumber
and formatShareAmount calls to use 8 decimal precision for consistent display
across all pages (dashboard, users, reports, system-accounts).

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 02:55:23 -08:00
hailin d7bbb19571 fix(mining-admin-service): correct effective contribution calculation
Effective contribution should equal theoretical total (totalTrees * 22617)
since it includes all parts: personal 70% + operation 12% + province 1% +
city 2% + level 7.5% + bonus 7.5% = 100%.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 02:20:45 -08:00
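The identity this fix relies on, that the six category shares sum to exactly 100% of the per-tree total, is easy to check in code; the category names follow the commit, and 22617 is the per-tree base used throughout these commits:

```typescript
// Arithmetic check: personal 70% + operation 12% + province 1% + city 2%
// + level 7.5% + bonus 7.5% = 100%, so effective contribution equals
// the theoretical total of 22617 per tree.
const PER_TREE = 22617;
const SHARES: Record<string, number> = {
  personal: 0.70,
  operation: 0.12,
  province: 0.01,
  city: 0.02,
  level: 0.075,
  bonus: 0.075,
};

function effectivePerTree(): number {
  return Object.values(SHARES).reduce((sum, s) => sum + s * PER_TREE, 0);
}
```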
hailin 420dfbfd9f fix(mining-admin-web): display theoretical network contribution instead of effective
Changed "全网算力" card to show theoretical total (totalTrees * 22617) instead
of effective contribution. Added effective contribution to subValue for reference.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 02:04:08 -08:00
hailin cfbf1b21f3 feat(dashboard): add detailed contribution breakdown by category
Backend (contribution-service):
- Add getDetailedContributionStats() to repository
- Add getUnallocatedByLevelTier/BonusTier() to repository
- Extend stats API with level/bonus breakdown by tier
- Add getTotalTrees() to synced-data repository

Backend (mining-admin-service):
- Add detailed contribution stats calculation
- Calculate theoretical vs actual values per category
- Return level/bonus breakdown with unlocked/pending amounts

Frontend (mining-admin-web):
- Add ContributionBreakdown component showing:
  - Personal (70%), Operation (12%), Province (1%), City (2%)
  - Level contribution (7.5%) by tier: 1-5, 6-10, 11-15
  - Bonus contribution (7.5%) by tier: T1, T2, T3
- Update DashboardStats type definition
- Integrate breakdown component into dashboard page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 01:43:37 -08:00
hailin 1f15daa6c5 fix(planting-records): filter only MINING_ENABLED records and fix UI overflow
- Backend: Add status filter to getPlantingLedger and getPlantingSummary
- Frontend: Change Row to Wrap for info items to prevent width overflow

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 00:12:07 -08:00
hailin 8ae9e217ff fix(mining-app): fix mining records data parsing from mining-service
Map miningMinute->distributionMinute, minedAmount->shareAmount,
secondDistribution->priceSnapshot to match entity fields

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 00:02:30 -08:00
hailin 12f8fa67fc feat(mining-admin): add totalTrees, separate level/bonus pending display
- Add totalTrees field from syncedAdoption aggregate
- Rename fields: networkLevelPending, networkBonusPending
- Stats card: show level pending and bonus pending separately
- Add new stats card for total trees count
- Price overview: 2-row layout showing all contribution metrics

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:59:32 -08:00
hailin b310fde426 feat(mining-admin): show pending contribution in dashboard
- Add networkPendingContribution and networkBonusPendingContribution to API
- Display combined pending contribution (teamLevel + teamBonus) in stats card
- Replace 'total contribution' with 'pending contribution' in price overview

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:46:05 -08:00
hailin 81a58edaca fix(contribution-service): calculate totalContribution correctly in CDC event
Previously, totalContribution was incorrectly set to effectiveContribution.
Now correctly calculated as: personal + teamLevel + teamBonus

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:40:50 -08:00
hailin debc8605df fix(mining-app): rename MiningRecordsPage widget to avoid name conflict
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:33:38 -08:00
hailin dee9c511e5 feat(mining-admin): add total contribution to dashboard stats
- Add networkTotalContribution field to dashboard API response
- Display total hashrate alongside effective hashrate in stats cards
- Update price overview to show both effective and total contribution
- Change grid from 3 to 4 columns in price overview

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:32:29 -08:00
hailin 546c0060da feat(mining-app): add mining records and planting records pages
- Add mining records page showing distribution history with share amounts
- Add planting records page with adoption summary and detailed records
- Remove 推广奖励 and 收益明细 from profile page
- Add planting-ledger API endpoint and data models

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 23:23:31 -08:00
hailin b81ae634a6 fix(mining-app): hardcode team bonus tiers display to 15
- Profile page: 团队上级 shows '15' instead of actual unlockedBonusTiers
- Contribution page: 已解锁上级 shows '15级' instead of actual value

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 20:28:02 -08:00
hailin 0cccc0e2cd refactor(mining-app): rename VIP等级 to 团队上级 and 直推人数 to 引荐人数
- Changed "VIP等级" label to "团队上级" in profile stats row
- Changed display value from vipLevel (V3 format) to unlockedBonusTiers (raw number)
- Changed "直推人数" label to "引荐人数" for consistency

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 20:07:52 -08:00
hailin cd938f4a34 refactor(mining-app): rename team contribution labels
Update contribution page labels:
- "团队层级" → "团队下级"
- "团队奖励" → "团队上级"
- "直推人数" → "引荐人数"
- "已解锁奖励" → "已解锁上级" (with unit "档" → "级")
- "已解锁层级" → "已解锁下级"
- "直推及间推" → "引荐及间推" in subtitle

Update contribution records page labels:
- "团队层级" → "团队下级"
- "团队奖励" → "团队上级"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:58:41 -08:00
hailin 84fa3e5e19 refactor(mining-app): rename 绿积分 to 积分值 across all pages
Replace all occurrences of "绿积分" with "积分值" in:
- trading_page.dart (price display, pool name, input field)
- asset_page.dart (account labels)
- trading_account.dart (entity comment)
- price_info.dart (entity comment)
- market_overview.dart (entity comment)
- DEVELOPMENT_GUIDE.md (documentation)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:57:17 -08:00
hailin adeeadb495 fix(mining-app): update profile page - hide items and rename label
- Rename "团队层级" to "团队下级" in stats row
- Hide "实名认证" option from account settings
- Hide "我的邀请码" card section entirely
- Remove unused _buildInvitationCard and _buildActionButton methods

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:47:14 -08:00
hailin 42a28efe74 fix(mining-app): remove operator account note from expiration card
Remove the "运营账号贡献值永不失效" note from the contribution
expiration countdown card.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:28:31 -08:00
hailin 91b8cca41c feat(mining-app): implement hide/show amounts toggle
- Add hideAmountsProvider to control amount visibility
- Add tap handler to eye icon in total contribution card
- Toggle icon between visibility_outlined and visibility_off_outlined
- Hide amounts with **** when toggled in:
  - Total contribution value
  - Three column stats (personal, team level, team bonus)
  - Today's estimated earnings
  - Contribution detail summary rows

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:22:03 -08:00
hailin 02cc79d67a fix(mining-app): reduce bottom padding on navigation pages
Reduce bottom SizedBox from 100 to 24 on all four main navigation
pages (contribution, trading, asset, profile) to eliminate excessive
whitespace when scrolling to bottom.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:17:59 -08:00
hailin 7bc8547a96 fix(mining-app): rename ContributionRecordsListPage to avoid name conflict
- Rename page class from ContributionRecordsPage to ContributionRecordsListPage
- Add typedef RecordsPageData for ContributionRecordsPage data model
- Fix import statements and unused variable

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:08:09 -08:00
hailin caffb124d2 feat(mining-app): add contribution records page with category summary
- Create contribution_records_page.dart with full list view
  - Pagination support with page navigation
  - Filter by source type (personal, team level, team bonus)
  - Show detailed info: tree count, base contribution, rate, amount
  - Display effective/expire dates and status badges

- Update contribution_page.dart detail card
  - Show category summary instead of record list
  - Display three categories with icons: personal, team level, team bonus
  - Add navigation to full records page via "查看全部"

- Add route configuration for /contribution-records

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 19:02:30 -08:00
hailin 141db46356 fix(contribution-service): use real contributionPerTree from rate service
Previously, adoptions were synced with hardcoded contributionPerTree=1,
resulting in contribution values like 0.7 instead of the expected 15831.9.

Now the handler fetches the actual contribution rate from ContributionRateService
based on the adoption date, storing values like:
- Personal (70%): 22617 × 70% = 15831.9
- Team Level (0.5%): 22617 × 0.5% = 113.085
- Team Bonus (2.5%): 22617 × 2.5% = 565.425

Note: Historical data may need migration to apply the correct multiplier.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 18:01:30 -08:00
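The corrected per-tree figures quoted in the commit can be reproduced directly from the 22617 base rate:

```typescript
// Reproducing the commit's figures: each record stores the base rate
// multiplied by its category percentage, not a hardcoded 1.
const RATE = 22617;
const personal = RATE * 0.70;          // 15831.9
const teamLevelPerTier = RATE * 0.005; // 113.085
const teamBonusPerTier = RATE * 0.025; // 565.425
```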
hailin f57b0f9c26 chore(mining-app): configure release build
- Add kDebugMode check to LoggingInterceptor to suppress logs in release
- Remove debug print statements from contribution_providers
- Add Play Core proguard rules to fix R8 missing classes error

Build command: flutter build apk --release --split-per-abi --target-platform android-arm,android-arm64
Output:
- app-arm64-v8a-release.apk: 18MB
- app-armeabi-v7a-release.apk: 16MB

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 17:24:42 -08:00
hailin c852f24a72 fix(auth-service): add 'auth/' prefix to controller routes for Kong compatibility
Kong routes /api/v2/auth/* to auth-service without stripping the path,
so controllers need 'auth/' prefix to match frontend requests:
- SmsController: 'sms' -> 'auth/sms'
- PasswordController: 'password' -> 'auth/password'
- UserController: 'user' -> 'auth/user'

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:53:48 -08:00
hailin cb3c7623dc fix(mining-app): fix Riverpod ref usage in router redirect callback
Use cached auth state from AuthNotifier instead of ref.read() to avoid
"Cannot use ref functions after provider changed" exception during rebuild.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:49:52 -08:00
hailin f2692a50ed fix(contribution-service): fix toRecordDto using wrong property name
- Changed `record.finalContribution` to `record.amount` for getting final contribution value
- Added optional chaining to prevent undefined errors
- Added default values for safety

The ContributionRecordAggregate uses `amount` property, not `finalContribution`.
This was causing "Cannot read properties of undefined (reading 'value')" errors.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:43:14 -08:00
hailin ed9f817fae feat(mining-app): add estimated earnings and contribution stats API
- Add ContributionStats entity and model for network-wide statistics
- Add /api/v2/contribution/stats endpoint
- Implement estimatedEarningsProvider to calculate daily earnings
- Formula: (user contribution / total contribution) × daily allocation
- Update contribution page to display real estimated earnings
- Add debug logs for contribution records API

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:37:30 -08:00
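The earnings formula in the commit is a simple proportional share; a sketch, where the function name and the sample daily allocation in the test are illustrative, not values from the source:

```typescript
// Sketch of the estimated daily earnings formula:
// (user contribution / total network contribution) × daily allocation.
function estimatedDailyEarnings(
  userContribution: number,
  networkTotal: number,
  dailyAllocation: number,
): number {
  if (networkTotal <= 0) return 0; // guard against division by zero
  return (userContribution / networkTotal) * dailyAllocation;
}
```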
hailin 6bcb4af028 feat(mining-app): integrate real APIs for Asset and Profile pages
- Asset page now uses trading-service /asset/my endpoint
- Profile page integrates auth-service /user/profile and contribution-service
- Add new entities: AssetDisplay, PriceInfo, MarketOverview, TradingAccount
- Add corresponding models with JSON parsing
- Create asset_providers and profile_providers for state management
- Update trading_providers with real API integration
- Extend UserState and UserInfo with additional profile fields
- Remove obsolete buy_shares and sell_shares use cases
- Fix compilation errors in get_current_price and trading_page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 08:22:40 -08:00
hailin 106a287260 fix(mining-service): make health endpoints public 2026-01-14 07:35:42 -08:00
hailin 30dc2f6665 fix(trading-service): make health endpoints public 2026-01-14 07:28:24 -08:00
hailin e1fb70e2ee feat(trading-service): add burn system, Kafka events, and idempotency
- Add trading burn system with black hole, share pool, and price calculation
- Implement per-minute auto burn and sell burn with multiplier
- Add Kafka event publishing via outbox pattern (order, trade, burn events)
- Add user.registered consumer to auto-create trading accounts
- Implement Redis + DB dual idempotency for event processing
- Add price, burn, and asset API controllers
- Add migrations for burn tables and processed events

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 07:15:41 -08:00
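The "Redis + DB dual idempotency" named above can be sketched with in-memory stand-ins: a fast layer (Redis) short-circuits duplicates cheaply, while a durable layer (a DB table with a unique constraint) guarantees exactly-once processing even if the fast layer is cold. Class and method names here are illustrative:

```typescript
// Sketch of dual-layer idempotent event processing.
class IdempotentConsumer {
  private fastSeen = new Set<string>();    // stands in for a Redis SETNX check
  private durableSeen = new Set<string>(); // stands in for a processed_events table

  // Returns true if the handler ran, false if the event was a duplicate.
  process(eventId: string, handler: () => void): boolean {
    if (this.fastSeen.has(eventId) || this.durableSeen.has(eventId)) return false;
    this.durableSeen.add(eventId); // a real DB insert would throw on duplicates
    this.fastSeen.add(eventId);
    handler();
    return true;
  }
}
```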
hailin f3d4799efc feat(mining-wallet): add UserWalletCreated/Updated events for CDC sync
- Publish UserWalletCreated when a new wallet is created
- Publish UserWalletUpdated when wallet balance changes
- Events sent to cdc.mining-wallet.outbox topic for mining-admin-service

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 06:13:34 -08:00
hailin 839feab97d fix(mining-admin): handle CONTRIBUTION_CREDITED event for wallet sync
Add handler for CONTRIBUTION_CREDITED events from mining-wallet-service
to sync user wallet data to synced_user_wallets table.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 06:11:49 -08:00
hailin 465e398040 fix(mining-admin): fix wallet ledger API to match frontend expected format
- Return usdtAvailable, usdtFrozen, pendingUsdt, settleableUsdt,
  settledTotalUsdt, expiredTotalUsdt instead of old field names
- Query SyncedUserWallet table for GREEN_POINTS wallet data
- Use miningAccount.availableBalance for pendingUsdt

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:56:24 -08:00
hailin c6c875849a fix(mining-service): make mining API public for service-to-service calls
Add @Public() decorator to MiningController to allow mining-admin-service
to fetch mining records without authentication.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:46:11 -08:00
hailin ce95c40c84 fix(mining-service): listen to correct CDC topic for contribution sync
Changed event handler to:
- Listen to 'cdc.contribution.outbox' topic (CDC/Debezium format)
- Handle 'ContributionAccountUpdated' events instead of 'ContributionCalculated'
- Use effectiveContribution for mining power calculation

This fixes the issue where mining accounts had zero totalContribution
because they weren't receiving contribution sync events.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:30:38 -08:00
hailin e6d966e89f fix(mining-admin): fetch mining records from mining-service
Update getUserMiningRecords to call mining-service API instead of
returning empty records. This enables the admin dashboard to display
actual user mining records.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 05:14:03 -08:00
hailin 270c17829e fix(mining-admin-service): move mining routes before :category/:key parameter route
NestJS matches routes in definition order. The :category/:key route was
matching mining/status before the specific mining routes. Moved mining
routes before the parameter routes to fix routing.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:57:25 -08:00
hailin 289ac0190c fix(mining-admin-service): add logging and fix null data handling in getMiningStatus
- Add debug logging to trace mining service calls
- Return error object instead of null when data is missing
- Include error message in response for debugging

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:42:01 -08:00
hailin 467d637ccc fix(mining-admin-web): prevent duplicate /api/v2 in rewrite destination
Clean NEXT_PUBLIC_API_URL to remove trailing /api/v2 if present,
preventing paths like /api/v2/api/v2/configs/mining/status

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:37:32 -08:00
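The env-var cleanup in the commit above can be sketched as one regex replace; the hostname in the test is hypothetical:

```typescript
// Sketch: strip a trailing /api/v2 (with or without trailing slash) from
// NEXT_PUBLIC_API_URL so rewrite destinations don't double the prefix.
function cleanApiUrl(url: string): string {
  return url.replace(/\/api\/v2\/?$/, "");
}
```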
hailin c9690b0d36 Revert "fix(mining-admin-web): always use /api proxy instead of direct API URL"
This reverts commit 7a65ab3319.
2026-01-14 04:34:22 -08:00
hailin 7a65ab3319 fix(mining-admin-web): always use /api proxy instead of direct API URL
Browser cannot access Docker internal URLs like http://mining-admin-service:3023.
Always use /api which is proxied by Next.js rewrites to the backend service.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:32:59 -08:00
hailin e99b5347da feat(mining-admin-service): add transfer-enabled API endpoints
Add GET and POST /configs/transfer-enabled endpoints to control
the transfer switch. Routes are placed before :category/:key to
avoid being matched as path parameters.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:22:11 -08:00
hailin 29dd1affe1 fix(mining-admin-web): extract data from response wrapper
mining-admin-service uses TransformInterceptor which wraps all responses
with { success, data, timestamp } format. Frontend needs to access
response.data.data to get the actual data.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 04:18:51 -08:00
hailin a15dcafc03 fix(mining-admin-service): unwrap the data field in mining-service responses 2026-01-14 04:09:02 -08:00
hailin d404521841 fix(mining-admin-service): fix mining-service API paths to use v2 2026-01-14 03:58:02 -08:00
hailin 09b15da3cb fix(mining-service): use millisecond PX instead of second EX for the Redis lock to support fractional TTLs 2026-01-14 03:52:22 -08:00
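The PX fix above matters because Redis `SET ... EX` only accepts whole seconds, so a 0.9-second lock TTL must be expressed in milliseconds via `PX`. A sketch of building the lock command arguments (key and token names are illustrative):

```typescript
// Sketch: build SET-based lock arguments, converting a fractional TTL in
// seconds to whole milliseconds for PX. NX ensures the lock is only taken
// when no other instance holds it.
function lockArgs(key: string, token: string, ttlSeconds: number): string[] {
  return ["SET", key, token, "PX", String(Math.round(ttlSeconds * 1000)), "NX"];
}
```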
hailin 901247366d fix(mining-service): add tsconfig include/exclude settings to fix the build 2026-01-14 03:48:18 -08:00
hailin 0abc04b9cb fix(mining-service): add a build verification step to the Dockerfile 2026-01-14 03:45:51 -08:00
hailin 2b083991d0 feat(mining-service): add migration changing minuteDistribution to secondDistribution
Supports the per-second mining distribution feature.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:40:41 -08:00
hailin 8f616dd45b fix(mining-service): fix Dockerfile to support prisma seed
- Add ts-node/typescript to the production image so the seed can run
- Run prisma db seed in the startup script
- Copy tsconfig.json into the production image

Based on the mining-wallet-service Dockerfile configuration.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:35:34 -08:00
hailin 1008672af9 Revert "fix(mining-service): fix Docker build issues"
This reverts commit f4380604d9.
2026-01-14 03:34:58 -08:00
hailin f4380604d9 fix(mining-service): fix Docker build issues
- tsconfig.json: add include/exclude to exclude the prisma folder
- Add .dockerignore to exclude seed.ts
- Dockerfile: add build verification

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:34:04 -08:00
hailin 3b61f2e095 feat(mining): implement per-second mining distribution system
Core changes:
- Scheduler runs every second instead of every minute, so users see mining rewards each second
- Account balances update every second, while MiningRecord is aggregated and written once per minute (reduces data volume)
- Seed runs automatically (prisma.seed config); isActive=false after initialization
- Only one manual step remains: the admin clicks "Start Mining" in the console

Technical details:
- Per-second distribution: 1,000,000 / 63,072,000 seconds ≈ 0.01585 shares/second
- Redis accumulator: per-second mining data accumulates in Redis and is written to the database at the end of each minute
- Distributed lock: 0.9-second lock duration, supports multi-instance deployment
- Admin console: add mining status card and activate/deactivate buttons

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:25:47 -08:00
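The per-second figure follows from the two-year era length: 2 × 365 × 24 × 3600 = 63,072,000 seconds. A quick arithmetic check (constants mirror the commit message, not the live config):

```typescript
const ERA_TOTAL_SHARES = 1_000_000;      // shares distributed in the first era
const ERA_SECONDS = 2 * 365 * 24 * 3600; // 63,072,000 seconds in two years

const perSecond = ERA_TOTAL_SHARES / ERA_SECONDS;
// perSecond ≈ 0.01585 shares/second, matching the commit message
```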
hailin 25608babd6 feat(mining-service): add initialization APIs and seed script
Add admin endpoints:
- GET /admin/status - Get mining system status
- POST /admin/initialize - Initialize mining config (one-time)
- POST /admin/activate - Activate mining distribution

Add prisma seed script for database initialization:
- MiningConfig: 100.02B total shares, 2,000,000-share distribution pool
- BlackHole: 10 billion burn target
- MiningEra: first era with 1,000,000-share distribution
- PoolAccounts: SHARE_POOL, BLACK_HOLE_POOL, CIRCULATION_POOL

Based on requirements:
- First two-year era distributes 1,000,000 share points
- Second two-year era distributes 500,000 share points (halved)
- 10 billion burned into the black hole over 10 years

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:36:52 -08:00
hailin bd0f98cfb3 fix(mining-admin-web): fix audit logs page crash
- Use 'all' instead of empty string for SelectItem value (Radix requirement)
- Add null safety for items array with fallback to empty array
- Fix potential undefined access on data.items

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:30:07 -08:00
hailin a2adddbf3d fix(mining-admin): transform dashboard API response to match frontend expected format
Frontend expects flat DashboardStats and RealtimeData interfaces.
Transform backend nested response to:
- totalUsers, adoptedUsers, networkEffectiveContribution, etc.
- currentMinuteDistribution, activeOrders, pendingTrades, etc.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:23:54 -08:00
hailin d6064294d7 refactor(mining-admin): remove initialization feature
System initialization is now handled by seed scripts and CDC sync,
so the manual initialization UI is no longer needed.

Removed:
- Frontend: initialization page and sidebar menu item
- Backend: InitializationController and InitializationService

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:22:23 -08:00
hailin 36c3ada6a6 fix(mining-admin): fix audit logs API path and response format
- Change controller path from /audit-logs to /audit to match frontend
- Transform response to frontend expected format (items, totalPages, etc.)
- Map admin.username to adminUsername field
- Add keyword query parameter support

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:18:53 -08:00
hailin 13e94db450 feat(mining-admin): add /reports/daily endpoint for frontend reports page
Add ReportsController with /reports/daily endpoint that maps the
dashboard service data to the format expected by the frontend.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:10:47 -08:00
hailin feb871bcf1 feat(mining-admin): add daily report generation service
Add DailyReportService that:
- Generates daily reports on startup
- Updates reports every hour
- Collects stats from synced tables (users, adoptions, contributions, mining, trading)
- Supports historical report generation for backfilling

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 02:03:21 -08:00
hailin 4292d5da66 fix(mining-admin-web): fix TypeScript type for empty mainPools array 2026-01-14 01:55:58 -08:00
hailin a7a2282ba7 fix(mining-admin-web): update account type categorization to match backend
Update categorizeAccounts to use correct account types returned by backend:
- Core accounts: HEADQUARTERS, OPERATION, FEE
- Region accounts: PROVINCE, CITY

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:53:11 -08:00
hailin fa6826dde3 fix(mining-admin): use CDC synced tables for system accounts API
Change SystemAccountsService to read from syncedWalletSystemAccount and
syncedWalletPoolAccount tables instead of local tables. This fixes the
issue where the frontend shows "暂无数据" (no data) despite data being synced.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:44:22 -08:00
hailin eff71a6b22 feat(mining-wallet): publish outbox events for system/pool accounts
Add WalletSystemAccountCreated and WalletPoolAccountCreated events:
- seed.ts: publish events when creating HQ/OP/FEE and pool accounts
- contribution-wallet.service.ts: publish events when auto-creating
  province/city system accounts

This enables mining-admin-service to sync system accounts via CDC.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:28:48 -08:00
hailin 0bbb52284c fix(contribution): avoid nested transaction timeout in BonusClaimService
Use unitOfWork.isInTransaction() to detect if already in a transaction
context (called from ContributionCalculationService). If so, reuse the
existing transaction instead of opening a new one, preventing Prisma
interactive transaction timeout errors.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 01:02:08 -08:00
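The guard can be sketched like this (the `UnitOfWork` interface is simplified; the real service wraps Prisma interactive transactions):

```typescript
interface UnitOfWork {
  isInTransaction(): boolean;
  current(): unknown;                                  // the active transaction client
  begin<T>(fn: (tx: unknown) => Promise<T>): Promise<T>;
}

// Reuse the caller's transaction instead of nesting a new one, which would
// hold the outer transaction open and trip Prisma's interactive-transaction timeout.
async function runInTx<T>(uow: UnitOfWork, fn: (tx: unknown) => Promise<T>): Promise<T> {
  if (uow.isInTransaction()) {
    return fn(uow.current());
  }
  return uow.begin(fn);
}
```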
hailin 7588d18fff fix(mining-wallet): fix province/city creation and add seed on startup
- Use provinceCode directly instead of inferring from cityCode
- Use code as name for province/city records
- Add ts-node to production for seed execution
- Run prisma db seed on container startup

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 00:40:49 -08:00
hailin e6e44d9a43 Revert "fix(mining-wallet): auto-create HEADQUARTERS account, skip DEFAULT province/city"
This reverts commit bf004bab52.
2026-01-14 00:19:12 -08:00
hailin bf004bab52 fix(mining-wallet): auto-create HEADQUARTERS account, skip DEFAULT province/city 2026-01-14 00:18:53 -08:00
hailin a03b883350 fix(mining-wallet): exclude prisma directory from TypeScript compilation 2026-01-14 00:07:58 -08:00
hailin 2a79c83715 feat(contribution): implement TEAM_BONUS backfill when unlock conditions met
When a user's direct referral count reaches 2 or 4, the system now automatically
backfills previously pending TEAM_BONUS (T2/T3) contributions that were allocated
to headquarters while waiting for unlock conditions.

- Add BonusClaimService for handling bonus backfill logic
- Add findPendingBonusByAccountSequence and claimBonusRecords to repository
- Integrate bonus claim into updateReferrerUnlockStatus flow
- Add BonusClaimed event consumer in mining-wallet-service
- Generate ledger records for backfilled contributions

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:58:54 -08:00
hailin ef330a2687 feat(mining-wallet): add seed and auto-create province/city accounts
- Add prisma seed to initialize core system accounts (HQ, OP, FEE) and pool accounts
- Auto-create province/city system accounts on-demand during contribution distribution
- Province/city regions are also auto-created if not exist

This ensures:
1. Core accounts exist after deployment (via seed)
2. Province/city accounts are created dynamically as orders come in

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:36:31 -08:00
hailin 6594845d4c fix(mining-wallet): fix Kafka consumers not subscribing to topics
- Change consumers from @Injectable to @Controller for @EventPattern to work
- Move consumers from providers to controllers array in module
- Add subscribe.fromBeginning config to Kafka microservice

The consumers were not receiving messages because NestJS microservices
require @EventPattern handlers to be in @Controller classes, not just
@Injectable services.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:31:31 -08:00
hailin 77b682c8a8 feat(mining-wallet): make initialize endpoints public for internal network calls
Changed system-accounts/initialize and pool-accounts/initialize endpoints from
@AdminOnly to @Public to allow deploy scripts to call them without authentication.
These endpoints are only accessible from internal network.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 23:22:17 -08:00
hailin 6ec79a6672 fix(deploy): correct CDC sync API URL path
Change from /health/cdc-sync to /api/v2/health/cdc-sync

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 22:26:32 -08:00
hailin 631fe2bf31 fix(contribution-service): reset consumer group offsets to earliest on startup
Use admin.resetOffsets({ earliest: true }) before connecting consumer
to ensure CDC sync always starts from the beginning of Kafka topics,
regardless of previously committed offsets.

This fixes the infinite loop issue where existing consumer groups
had committed offsets at high watermark, causing eachMessage to
never be called.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 22:14:51 -08:00
hailin d968efcad4 fix(contribution): run CDC sync in background to allow API access during sync
Change CDC consumer startup from blocking await to non-blocking .then()
so HTTP server starts immediately and /health/cdc-sync API is accessible
for deploy script to poll sync status.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:50:59 -08:00
hailin 5a4970d7d9 Revert "fix(contribution): run CDC sync in background to avoid blocking service startup"
This reverts commit 703c12e9f6.
2026-01-13 21:44:18 -08:00
hailin 703c12e9f6 fix(contribution): run CDC sync in background to avoid blocking service startup
- Change await to .then() for cdcConsumer.start()
- Allows HTTP endpoints to be accessible during CDC sync

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:44:00 -08:00
hailin 8199bc4d66 feat(contribution): add CDC sync status API and fix deploy script timing
- Add initialSyncCompleted flag to track CDC sequential sync completion
- Add getSyncStatus() method to CDCConsumerService
- Add /health/cdc-sync endpoint to expose sync status
- Update deploy-mining.sh to wait for CDC sync completion before calling publish APIs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:34:58 -08:00
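A minimal sketch of the status surface (field and method names follow the commit message; the HTTP wiring through NestJS is omitted):

```typescript
// The /health/cdc-sync endpoint simply returns getSyncStatus(); the deploy
// script polls it until initialSyncCompleted is true before calling publish APIs.
class CDCConsumerService {
  private initialSyncCompleted = false;

  markInitialSyncDone(): void {
    this.initialSyncCompleted = true;
  }

  getSyncStatus(): { initialSyncCompleted: boolean } {
    return { initialSyncCompleted: this.initialSyncCompleted };
  }
}
```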
hailin aef6feb2cd fix(contribution): use unique consumer group id for each phase
Previous consumer group had already consumed messages, so fromBeginning
had no effect. Now using timestamp-based unique group id to ensure
fresh consumption from beginning each time.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:11:40 -08:00
hailin 22523aba14 revert: restore blocking await for sequential CDC consumption
The previous change was wrong - running sequential consumption in
background defeats its purpose. The whole point is to ensure data
dependency order (users -> referrals -> adoptions) before any other
operations can proceed.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:07:57 -08:00
hailin a01fd3aa86 fix(contribution): run sequential CDC consumption in background
Prevents blocking NestJS onModuleInit during CDC sync by running
the sequential consumption in the background with error handling.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 21:07:11 -08:00
hailin d58e8b44ee feat(contribution): implement sequential CDC topic consumption
Implements sequential phase consumption to ensure correct data sync order:
1. User accounts (first)
2. Referral relationships (depends on users)
3. Planting orders (depends on users and referrals)

Each phase must complete before the next starts, guaranteeing 100%
reliable data dependency ordering. After all phases complete, switches
to continuous parallel consumption for real-time updates.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:57:24 -08:00
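The phase ordering can be sketched as follows (`drainTopic` is a hypothetical stand-in for consuming a Kafka topic up to its high watermark):

```typescript
const consumed: string[] = [];

async function drainTopic(topic: string): Promise<void> {
  consumed.push(topic); // real code: consume until caught up, then resolve
}

async function runInitialSync(): Promise<void> {
  // Dependency order: users -> referrals -> adoptions; each phase must
  // finish before the next starts.
  for (const topic of ["users", "referrals", "adoptions"]) {
    await drainTopic(topic);
  }
  // after all phases complete, switch to continuous parallel consumption
}
```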
hailin 30949af577 revert: undo unauthorized ancestor_path and setDirectReferralAdoptedCount changes
Reverts commits:
- 1fbb88f7: setDirectReferralAdoptedCount change
- 471702d5: ancestor_path chain building change

These changes were made without authorization. The original code was correct.
MINING_ENABLED filtering (from dbf97ae4) is preserved.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:46:41 -08:00
hailin 1fbb88f773 fix(contribution): use setDirectReferralAdoptedCount for accurate count update
Changed updateReferrerUnlockStatus to:
1. Create account if not exists (for full-reset scenarios)
2. Use setDirectReferralAdoptedCount instead of increment loop
3. This ensures the count is always accurate regardless of processing order

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:29:53 -08:00
hailin 5eae4464ef fix(mining-app): remove unnecessary token refresh on app startup
Users were being redirected to login page when clicking navigation
because the background token refresh was failing and clearing user state.

Token refresh should only happen when API returns 401, not on every app launch.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:28:07 -08:00
hailin d43a70de93 feat(mining-admin): implement complete system accounts feature
- Add system account types and display metadata
- Create API layer with getList and getSummary endpoints
- Add React Query hooks for data fetching
- Create AccountCard, AccountsTable, SummaryCards components
- Refactor page with tabs, refresh button, and error handling
- Add Alert UI component

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:27:59 -08:00
hailin 471702d562 fix(contribution): use ancestor_path to build upline chain for TEAM_LEVEL distribution
Root cause: CDC sync order issue caused referrerAccountSequence to be null,
resulting in empty ancestor chain and all TEAM_LEVEL contributions going to unallocated.

Changes:
- buildAncestorChainFromReferral: Uses ancestor_path (contains complete user_id chain) to build upline chain
- getDirectReferrer: Gets direct referrer using ancestor_path as fallback
- findAncestorChain: Updated to use ancestor_path when available

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 20:14:46 -08:00
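Assuming `ancestor_path` stores the complete user_id chain as a delimited string (the `/` delimiter below is an assumption), the upline chain can be rebuilt without relying on referrerAccountSequence:

```typescript
// Hypothetical: ancestor_path = "rootId/.../parentId" for a given user.
// Splitting and reversing yields the upline chain, direct referrer first,
// as needed for TEAM_LEVEL distribution.
function buildUplineChain(ancestorPath: string): string[] {
  if (!ancestorPath) return [];
  return ancestorPath.split("/").filter(Boolean).reverse();
}

const chain = buildUplineChain("U001/U042/U107");
// chain[0] is the direct referrer, the last element the root ancestor
```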
hailin dbf97ae487 fix(contribution-service): filter adoptions by MINING_ENABLED status
Only process adoptions with MINING_ENABLED status for contribution calculation.
This fixes the bug where non-final adoption records (PENDING, PAID, etc.) were
incorrectly being processed, causing duplicate contribution records.

Affected methods:
- findUndistributedAdoptions: only process MINING_ENABLED adoptions
- getDirectReferralAdoptedCount: only count users with MINING_ENABLED adoptions
- getTotalTreesByAccountSequence: only sum trees from MINING_ENABLED adoptions
- getTeamTreesByLevel: only count MINING_ENABLED adoptions
- countUndistributedAdoptions: only count MINING_ENABLED adoptions

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 19:48:34 -08:00
hailin fdfc2d6700 fix(contribution): ensure 100% reliable CDC sync to mining-admin-service
- Add ContributionAccountUpdatedEvent for real-time account updates
- Publish outbox events when saving distribution results
- Publish outbox events when updating adopter/referrer unlock status
- Add incremental sync every 10 minutes for recently updated accounts
- Add daily full sync at 4am as final consistency guarantee
- Add findRecentlyUpdated repository method for incremental sync

Three-layer sync guarantee:
1. Real-time: publish events on every account update
2. Incremental: scan accounts updated in last 15 minutes every 10 mins
3. Full sync: publish all accounts daily at 4am

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 19:27:50 -08:00
hailin 3999d7cc51 fix(contribution): 100% sync CDC data and fix calculation trigger timing
- Remove conditional skip logic in CDC handlers
- Always sync all field updates (including status changes)
- Trigger contribution calculation only when status becomes MINING_ENABLED
- Fix user and referral handlers to sync all fields without skipping

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 16:55:25 -08:00
hailin 20eabbb85f fix(mining-admin): restore MINING_ENABLED status filter for adoption stats
Revert the previous change that removed the status filter. The stats
should only count adoptions with MINING_ENABLED status, as only those
are active for mining. The issue is likely that the status field in
synced_adoptions table doesn't have the correct value.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 01:32:39 -08:00
hailin 65bd4f9b65 fix(mining-admin): remove MINING_ENABLED status filter for adoption stats
The adoption stats were showing 0 because the synced_adoptions table
contains status values directly from 1.0 system (PAID, POOL_INJECTED, etc.)
rather than MINING_ENABLED. Since contribution-service doesn't update the
status after calculating contributions, we now count all synced adoptions.

Changes:
- Remove status filter in getAdoptionStatsForUsers
- Remove status filter in getUserDetail adoption queries
- Remove status filter in getUserAdoptionStats for referral tree
- Add order count display in user detail page

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 01:21:01 -08:00
hailin 2f3a0f3652 feat(mining-admin): display adoption order count in user management
Backend:
- Add personalOrders and teamOrders to adoption stats
- Return order count alongside tree count in user list API

Frontend:
- Add personalAdoptionOrders and teamAdoptionOrders to UserOverview type
- Display format: "树数量(订单数)" e.g. "6(3单)"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 01:03:59 -08:00
hailin 56ff8290c1 fix(mining-admin): filter adoption stats by MINING_ENABLED status
Only count adoptions with status='MINING_ENABLED' when calculating:
- Personal adoption count (user list)
- Team adoption count (user list)
- Personal adoption stats (user detail)
- Direct referral adoptions (user detail)
- Team adoptions (user detail)
- Referral tree adoption stats

This fixes incorrect adoption counts that included pending/unconfirmed orders.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 00:58:01 -08:00
hailin 1d7d38a82c fix(frontend): prevent redirect to dashboard on page refresh
Fix hydration race condition where token check happened before
localStorage was read. Now waits for client-side initialization
before deciding whether to redirect to login.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-13 00:25:59 -08:00
361 changed files with 46342 additions and 4002 deletions

View File

@ -767,7 +767,37 @@
"Bash(git -C \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\" commit -m \"$\\(cat <<''EOF''\nfix\\(mining-app\\): update splash page theme and fix token refresh\n\n- Update splash_page.dart to orange theme \\(#FF6B00\\) matching other pages\n- Change app name from \"榴莲挖矿\" to \"榴莲生态\"\n- Fix refreshTokenIfNeeded to properly throw on failure instead of\n silently calling logout \\(which caused Riverpod ref errors\\)\n- Clear local storage directly on refresh failure without remote API call\n\nCo-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>\nEOF\n\\)\")",
"Bash(python3 -c \" import sys content = sys.stdin.read\\(\\) old = '''''' done # 清空 processed_cdc_events 表(因为 migration 时可能已经消费了一些消息) # 这是事务性幂等消费的关键:重置 Kafka offset 后必须同时清空幂等记录 log_info \"\"Truncating processed_cdc_events tables to allow re-consumption...\"\" for db in \"\"rwa_contribution\"\" \"\"rwa_auth\"\"; do if run_psql \"\"$db\"\" \"\"TRUNCATE TABLE processed_cdc_events;\"\" 2>/dev/null; then log_success \"\"Truncated processed_cdc_events in $db\"\" else log_warn \"\"Could not truncate processed_cdc_events in $db \\(table may not exist yet\\)\"\" fi done log_step \"\"Step 9/18: Starting 2.0 services...\"\"'''''' new = '''''' done # 清空 processed_cdc_events 表(因为 migration 时可能已经消费了一些消息) # 这是事务性幂等消费的关键:重置 Kafka offset 后必须同时清空幂等记录 log_info \"\"Truncating processed_cdc_events tables to allow re-consumption...\"\" for db in \"\"rwa_contribution\"\" \"\"rwa_auth\"\"; do if run_psql \"\"$db\"\" \"\"TRUNCATE TABLE processed_cdc_events;\"\" 2>/dev/null; then log_success \"\"Truncated processed_cdc_events in $db\"\" else log_warn \"\"Could not truncate processed_cdc_events in $db \\(table may not exist yet\\)\"\" fi done log_step \"\"Step 9/18: Starting 2.0 services...\"\"'''''' print\\(content.replace\\(old, new\\)\\) \")",
"Bash(git rm:*)",
"Bash(echo \"请在服务器运行以下命令检查 outbox 事件:\n\ndocker exec -it rwa-postgres psql -U rwa_user -d rwa_contribution -c \"\"\nSELECT id, event_type, aggregate_id, \n payload->>''sourceType'' as source_type,\n payload->>''accountSequence'' as account_seq,\n payload->>''sourceAccountSequence'' as source_account_seq,\n payload->>''bonusTier'' as bonus_tier\nFROM outbox_events \nWHERE payload->>''accountSequence'' = ''D25122900007''\nORDER BY id;\n\"\"\")",
"Bash(ssh -o ConnectTimeout=10 ceshi@14.215.128.96 'find /home/ceshi/rwadurian/frontend/mining-admin-web -name \"\"*.tsx\"\" -o -name \"\"*.ts\"\" | xargs grep -l \"\"用户管理\\\\|users\"\" 2>/dev/null | head -10')",
"Bash(dir /s /b \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\")",
"Bash(dir /b \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\backend\\\\services\")",
"Bash(ssh -J ceshi@103.39.231.231 ceshi@192.168.1.111 \"curl -s http://localhost:3021/api/v2/admin/status\")",
"Bash(del \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\frontend\\\\mining-app\\\\lib\\\\domain\\\\usecases\\\\trading\\\\buy_shares.dart\")",
"Bash(del \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\frontend\\\\mining-app\\\\lib\\\\domain\\\\usecases\\\\trading\\\\sell_shares.dart\")",
"Bash(ls -la \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\frontend\\\\mining-app\\\\lib\\\\presentation\\\\pages\"\" 2>/dev/null || dir /b \"c:UsersdongDesktoprwadurianfrontendmining-applibpresentationpages \")",
"Bash(cd:*)",
"Bash(ssh -o StrictHostKeyChecking=no -J ceshi@103.39.231.231 ceshi@192.168.1.111 \"curl -s http://localhost:3020/api/v1/ | head -100\")",
"Bash(ssh -o StrictHostKeyChecking=no -J ceshi@103.39.231.231 ceshi@192.168.1.111:*)",
"Bash(bc:*)",
"Bash(DATABASE_URL=\"postgresql://postgres:password@localhost:5432/mining_db?schema=public\" npx prisma migrate diff:*)",
"Bash(git status:*)",
"Bash(xargs cat:*)",
"Bash(ssh -o ProxyJump=ceshi@103.39.231.231 ceshi@192.168.1.111 \"docker ps | grep mining\")",
"Bash(dir /b \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\backend\\\\services\\\\trading-service\\\\src\\\\application\\\\services\")",
"Bash(DATABASE_URL=\"postgresql://postgres:password@localhost:5432/trading_db?schema=public\" npx prisma migrate dev:*)",
"Bash(dir /b \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\backend\\\\services\\\\mining-admin-service\\\\src\")",
"Bash(ssh -o ProxyJump=ceshi@103.39.231.231 ceshi@192.168.1.111 \"cd /home/ceshi/rwadurian/backend/service && ls -la\")",
"Bash(ssh -o ProxyJump=ceshi@103.39.231.231 ceshi@192.168.1.111 \"ls -la /home/ceshi/rwadurian/backend/\")",
"Bash(ssh -o ProxyJump=ceshi@103.39.231.231 ceshi@192.168.1.111 \"ls -la /home/ceshi/rwadurian/backend/services/\")",
"Bash(where:*)",
"Bash(npx md-to-pdf:*)",
"Bash(ssh -J ceshi@103.39.231.231 ceshi@192.168.1.111 \"curl -s ''http://localhost:3000/api/price/klines?period=1h&limit=5'' | head -500\")",
"Bash(dir /b /ad \"c:\\\\Users\\\\dong\\\\Desktop\\\\rwadurian\\\\backend\")",
"Bash(timeout 30 cat:*)",
"Bash(npm run lint)",
"Bash(ssh -o ProxyCommand=\"ssh -W %h:%p ceshi@103.39.231.231\" -o StrictHostKeyChecking=no ceshi@192.168.1.111 \"cat /home/ceshi/rwadurian/backend/services/mining-service/src/application/services/batch-mining.service.ts | head -250\")",
"Bash(ssh -o ProxyCommand=\"ssh -W %h:%p ceshi@103.39.231.231\" -o StrictHostKeyChecking=no ceshi@192.168.1.111 \"docker logs rwa-mining-admin-service --tail 50 2>&1 | grep ''第一条数据\\\\|最后一条数据''\")",
"Bash(npx xlsx-cli 挖矿.xlsx)"
],
"deny": [],
"ask": []

View File

@ -309,24 +309,42 @@ services:
  # ---------------------------------------------------------------------------
  # Trading Service 2.0
+ # Frontend path: /api/v2/trading/... -> backend path: /api/v2/...
  # ---------------------------------------------------------------------------
  - name: trading-service-v2
-   url: http://192.168.1.111:3022
+   url: http://192.168.1.111:3022/api/v2
    routes:
      - name: trading-v2-api
        paths:
          - /api/v2/trading
-       strip_path: false
+       strip_path: true
      - name: trading-v2-health
        paths:
          - /api/v2/trading/health
-       strip_path: false
+       strip_path: true
+ # ---------------------------------------------------------------------------
+ # Trading Service WebSocket - real-time price push
+ # WebSocket connection: wss://api.xxx.com/ws/price -> ws://192.168.1.111:3022/price
+ # Kong handles the HTTP -> WebSocket upgrade automatically, so protocols only needs http/https
+ # ---------------------------------------------------------------------------
+ - name: trading-ws-service
+   url: http://192.168.1.111:3022
+   routes:
+     - name: trading-ws-price
+       paths:
+         - /ws/price
+       strip_path: true
+       protocols:
+         - http
+         - https
  # ---------------------------------------------------------------------------
  # Mining Admin Service 2.0 - mining admin console service
+ # Frontend path: /api/v2/mining-admin/... -> backend path: /api/v2/...
  # ---------------------------------------------------------------------------
  - name: mining-admin-service
-   url: http://192.168.1.111:3023/api/v1
+   url: http://192.168.1.111:3023/api/v2
    routes:
      - name: mining-admin-api
        paths:
@ -356,18 +374,19 @@ services:
  # ---------------------------------------------------------------------------
  # Mining Wallet Service 2.0
+ # Frontend path: /api/v2/mining-wallet/... -> backend path: /api/v2/...
  # ---------------------------------------------------------------------------
  - name: mining-wallet-service
-   url: http://192.168.1.111:3025
+   url: http://192.168.1.111:3025/api/v2
    routes:
      - name: mining-wallet-api
        paths:
          - /api/v2/mining-wallet
-       strip_path: false
+       strip_path: true
      - name: mining-wallet-health
        paths:
          - /api/v2/mining-wallet/health
-       strip_path: false
+       strip_path: true
  # =============================================================================
  # Plugins - global plugin configuration

View File

@ -39,8 +39,9 @@ android {
    }
    // NDK configuration for TSS native library
+   // Only include ARM ABIs for real devices (x86_64 is for emulators only)
    ndk {
-       abiFilters += listOf("arm64-v8a", "armeabi-v7a", "x86_64")
+       abiFilters += listOf("arm64-v8a", "armeabi-v7a")
    }
}

View File

@ -29,6 +29,9 @@ data class ShareRecordEntity(
    @ColumnInfo(name = "party_index")
    val partyIndex: Int,

    @ColumnInfo(name = "party_id")
    val partyId: String, // The original partyId used during keygen - required for signing

    @ColumnInfo(name = "address")
    val address: String,
@ -90,15 +93,159 @@ interface AppSettingDao {
    suspend fun setValue(setting: AppSettingEntity)
}
/**
 * Transaction record database entity
 * Entity for storing transaction history records
 */
@Entity(
    tableName = "transaction_records",
    foreignKeys = [
        ForeignKey(
            entity = ShareRecordEntity::class,
            parentColumns = ["id"],
            childColumns = ["share_id"],
            onDelete = ForeignKey.CASCADE // deleting a wallet also deletes its transaction records
        )
    ],
    indices = [
        Index(value = ["share_id"]),
        Index(value = ["tx_hash"], unique = true),
        Index(value = ["from_address"]),
        Index(value = ["to_address"]),
        Index(value = ["created_at"])
    ]
)
data class TransactionRecordEntity(
    @PrimaryKey(autoGenerate = true)
    val id: Long = 0,

    @ColumnInfo(name = "share_id")
    val shareId: Long, // associated wallet ID

    @ColumnInfo(name = "from_address")
    val fromAddress: String, // sender address

    @ColumnInfo(name = "to_address")
    val toAddress: String, // recipient address

    @ColumnInfo(name = "amount")
    val amount: String, // transfer amount (human-readable format)

    @ColumnInfo(name = "token_type")
    val tokenType: String, // token type: KAVA, GREEN_POINTS, ENERGY_POINTS, FUTURE_POINTS

    @ColumnInfo(name = "tx_hash")
    val txHash: String, // transaction hash

    @ColumnInfo(name = "gas_price")
    val gasPrice: String, // gas price (Wei)

    @ColumnInfo(name = "gas_used")
    val gasUsed: String = "", // gas actually consumed

    @ColumnInfo(name = "tx_fee")
    val txFee: String = "", // transaction fee

    @ColumnInfo(name = "status")
    val status: String, // transaction status: PENDING, CONFIRMED, FAILED

    @ColumnInfo(name = "direction")
    val direction: String, // transaction direction: SENT, RECEIVED

    @ColumnInfo(name = "note")
    val note: String = "", // memo

    @ColumnInfo(name = "created_at")
    val createdAt: Long = System.currentTimeMillis(),

    @ColumnInfo(name = "confirmed_at")
    val confirmedAt: Long? = null, // confirmation time

    @ColumnInfo(name = "block_number")
    val blockNumber: Long? = null // block height
)
/**
* 转账记录 DAO
* Data Access Object for transaction records
*/
@Dao
interface TransactionRecordDao {
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertRecord(record: TransactionRecordEntity): Long
@Query("SELECT * FROM transaction_records WHERE id = :id")
suspend fun getRecordById(id: Long): TransactionRecordEntity?
@Query("SELECT * FROM transaction_records WHERE tx_hash = :txHash")
suspend fun getRecordByTxHash(txHash: String): TransactionRecordEntity?
@Query("SELECT * FROM transaction_records WHERE share_id = :shareId ORDER BY created_at DESC")
fun getRecordsForShare(shareId: Long): Flow<List<TransactionRecordEntity>>
@Query("SELECT * FROM transaction_records WHERE share_id = :shareId ORDER BY created_at DESC LIMIT :limit OFFSET :offset")
suspend fun getRecordsForSharePaged(shareId: Long, limit: Int, offset: Int): List<TransactionRecordEntity>
@Query("SELECT * FROM transaction_records WHERE share_id = :shareId AND token_type = :tokenType ORDER BY created_at DESC")
fun getRecordsForShareByToken(shareId: Long, tokenType: String): Flow<List<TransactionRecordEntity>>
@Query("SELECT * FROM transaction_records WHERE status = 'PENDING' ORDER BY created_at ASC")
suspend fun getPendingRecords(): List<TransactionRecordEntity>
@Query("UPDATE transaction_records SET status = :status, confirmed_at = :confirmedAt, block_number = :blockNumber, gas_used = :gasUsed, tx_fee = :txFee WHERE id = :id")
suspend fun updateStatus(id: Long, status: String, confirmedAt: Long?, blockNumber: Long?, gasUsed: String, txFee: String)
@Query("""
SELECT
COUNT(*) as total_count,
SUM(CASE WHEN direction = 'SENT' THEN 1 ELSE 0 END) as sent_count,
SUM(CASE WHEN direction = 'RECEIVED' THEN 1 ELSE 0 END) as received_count
FROM transaction_records
WHERE share_id = :shareId AND token_type = :tokenType
""")
suspend fun getTransactionStats(shareId: Long, tokenType: String): TransactionStats
@Query("SELECT COALESCE(SUM(CAST(amount AS REAL)), 0) FROM transaction_records WHERE share_id = :shareId AND token_type = :tokenType AND direction = 'SENT' AND status = 'CONFIRMED'")
suspend fun getTotalSentAmount(shareId: Long, tokenType: String): Double
@Query("SELECT COALESCE(SUM(CAST(amount AS REAL)), 0) FROM transaction_records WHERE share_id = :shareId AND token_type = :tokenType AND direction = 'RECEIVED' AND status = 'CONFIRMED'")
suspend fun getTotalReceivedAmount(shareId: Long, tokenType: String): Double
@Query("SELECT COALESCE(SUM(CAST(tx_fee AS REAL)), 0) FROM transaction_records WHERE share_id = :shareId AND direction = 'SENT' AND status = 'CONFIRMED'")
suspend fun getTotalTxFee(shareId: Long): Double
@Query("DELETE FROM transaction_records WHERE id = :id")
suspend fun deleteRecordById(id: Long)
@Query("DELETE FROM transaction_records WHERE share_id = :shareId")
suspend fun deleteRecordsForShare(shareId: Long)
@Query("SELECT COUNT(*) FROM transaction_records WHERE share_id = :shareId")
suspend fun getRecordCount(shareId: Long): Int
}
/**
* 转账统计数据类
*/
data class TransactionStats(
@ColumnInfo(name = "total_count")
val totalCount: Int,
@ColumnInfo(name = "sent_count")
val sentCount: Int,
@ColumnInfo(name = "received_count")
val receivedCount: Int
)
/** /**
* Room database * Room database
*/ */
@Database( @Database(
entities = [ShareRecordEntity::class, AppSettingEntity::class], entities = [ShareRecordEntity::class, AppSettingEntity::class, TransactionRecordEntity::class],
version = 2, version = 4, // Version 4: added transaction_records table for transfer history
exportSchema = false exportSchema = false
) )
abstract class TssDatabase : RoomDatabase() { abstract class TssDatabase : RoomDatabase() {
abstract fun shareRecordDao(): ShareRecordDao abstract fun shareRecordDao(): ShareRecordDao
abstract fun appSettingDao(): AppSettingDao abstract fun appSettingDao(): AppSettingDao
abstract fun transactionRecordDao(): TransactionRecordDao
} }
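The unique index on `tx_hash` together with `OnConflictStrategy.REPLACE` makes inserts idempotent: re-inserting a record for the same transaction (for example after a retry or a status refresh) overwrites the earlier row instead of duplicating it. A minimal JVM-only Java sketch of that last-write-wins behavior, modeling the table as a map keyed by the hash (class and method names here are illustrative, not part of the app):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TxDedupSketch {
    // Simulates REPLACE-on-conflict keyed by the unique tx_hash index:
    // a later insert with the same hash overwrites the earlier row.
    static Map<String, String> insertAll(String[][] records) {
        Map<String, String> byHash = new LinkedHashMap<>();
        for (String[] rec : records) {
            byHash.put(rec[0], rec[1]); // rec = {txHash, status}
        }
        return byHash;
    }

    public static void main(String[] args) {
        Map<String, String> rows = insertAll(new String[][] {
            {"0xabc", "PENDING"},
            {"0xdef", "PENDING"},
            {"0xabc", "CONFIRMED"}, // re-insert of the same tx updates, not duplicates
        });
        System.out.println(rows.size() + " " + rows.get("0xabc")); // 2 CONFIRMED
    }
}
```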


@@ -5,6 +5,8 @@ import com.durian.tssparty.data.local.AppSettingDao
 import com.durian.tssparty.data.local.AppSettingEntity
 import com.durian.tssparty.data.local.ShareRecordDao
 import com.durian.tssparty.data.local.ShareRecordEntity
+import com.durian.tssparty.data.local.TransactionRecordDao
+import com.durian.tssparty.data.local.TransactionRecordEntity
 import com.durian.tssparty.data.local.TssNativeBridge
 import com.durian.tssparty.data.remote.GrpcClient
 import com.durian.tssparty.data.remote.GrpcConnectionEvent
@@ -31,7 +33,8 @@ class TssRepository @Inject constructor(
     private val grpcClient: GrpcClient,
     private val tssNativeBridge: TssNativeBridge,
     private val shareRecordDao: ShareRecordDao,
-    private val appSettingDao: AppSettingDao
+    private val appSettingDao: AppSettingDao,
+    private val transactionRecordDao: TransactionRecordDao
 ) {
     private val _currentSession = MutableStateFlow<TssSession?>(null)
     val currentSession: StateFlow<TssSession?> = _currentSession.asStateFlow()
@@ -48,6 +51,12 @@ class TssRepository @Inject constructor(
     // partyId is loaded once from database in registerParty() and cached here
     // This matches Electron's getOrCreatePartyId() pattern
     private lateinit var partyId: String
+
+    // currentSigningPartyId: The partyId to use for the current signing session
+    // This may differ from partyId when signing with a restored wallet backup
+    // CRITICAL: For backup/restore to work, signing must use the original partyId from keygen
+    private var currentSigningPartyId: String? = null
+
     private var messageCollectionJob: Job? = null
     private var sessionEventJob: Job? = null
@@ -1051,6 +1060,7 @@ class TssRepository @Inject constructor(
         val address = AddressUtils.deriveEvmAddress(publicKeyBytes)
 
         // Save share record (use actual thresholds and party index from backend)
+        // CRITICAL: Save partyId - this is required for signing after backup/restore
         val shareEntity = ShareRecordEntity(
             sessionId = sessionId,
             publicKey = result.publicKey,
@@ -1058,6 +1068,7 @@
             thresholdT = actualThresholdT,
             thresholdN = actualThresholdN,
             partyIndex = actualPartyIndex,
+            partyId = partyId,
             address = address
         )
         val id = shareRecordDao.insertShare(shareEntity)
@@ -1115,14 +1126,26 @@ class TssRepository @Inject constructor(
         // Note: Password is verified during actual sign execution, same as Electron
 
+        // CRITICAL: Use the original partyId from the share (keygen time) for signing
+        // This is essential for backup/restore - the partyId must match what was used during keygen
+        // If shareEntity.partyId is empty (legacy data), fall back to current device's partyId
+        val signingPartyId = if (shareEntity.partyId.isNotEmpty()) {
+            shareEntity.partyId
+        } else {
+            android.util.Log.w("TssRepository", "Share has no partyId (legacy data), using current device partyId")
+            partyId
+        }
+        currentSigningPartyId = signingPartyId // Save for later use in this flow
+        android.util.Log.d("TssRepository", "Using signingPartyId=$signingPartyId (current device partyId=$partyId)")
+
         // CRITICAL: Set pendingSessionId BEFORE joinSession to avoid race condition
         // This ensures session_started events can be matched even if they arrive
         // before _currentSession is set
         pendingSessionId = sessionId
         android.util.Log.d("TssRepository", "Set pendingSessionId=$sessionId for event matching (sign joiner)")
 
-        // Join session via gRPC (matching Electron's grpcClient.joinSession)
-        val joinResult = grpcClient.joinSession(sessionId, partyId, joinToken)
+        // Join session via gRPC using the original partyId from keygen (CRITICAL for backup/restore)
+        val joinResult = grpcClient.joinSession(sessionId, signingPartyId, joinToken)
         if (joinResult.isFailure) {
             android.util.Log.e("TssRepository", "gRPC sign join failed", joinResult.exceptionOrNull())
             return@withContext Result.failure(joinResult.exceptionOrNull()!!)
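The fallback above is deliberately simple: prefer the partyId stored with the share at keygen time, and fall back to the device's current partyId only for legacy shares created before the `party_id` column existed (where the migration default is the empty string). A standalone Java sketch of that selection rule (the method name is illustrative, not part of the app):

```java
public class SigningPartySketch {
    // Prefer the partyId recorded at keygen; fall back to the device's
    // current partyId only when the share predates the party_id column.
    static String selectSigningPartyId(String sharePartyId, String devicePartyId) {
        if (sharePartyId != null && !sharePartyId.isEmpty()) {
            return sharePartyId;
        }
        return devicePartyId; // legacy share: column was backfilled with ''
    }

    public static void main(String[] args) {
        System.out.println(selectSigningPartyId("party-keygen-1", "party-device-9")); // party-keygen-1
        System.out.println(selectSigningPartyId("", "party-device-9"));               // party-device-9
    }
}
```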
@@ -1137,12 +1160,13 @@ class TssRepository @Inject constructor(
         // Build participants list (matching Electron's logic)
         // Prefer using parties from validateInviteCode (complete list)
+        // CRITICAL: Use signingPartyId (original partyId from keygen) for participant identification
         val participants = if (parties.isNotEmpty()) {
             parties.toMutableList()
         } else {
             // Fallback: use other_parties + self
             val list = sessionData.participants.toMutableList()
-            list.add(Participant(partyId, myPartyIndex, ""))
+            list.add(Participant(signingPartyId, myPartyIndex, ""))
             list.sortBy { it.partyIndex }
             list
         }
@@ -1222,10 +1246,14 @@ class TssRepository @Inject constructor(
         } else {
             messageHash
         }
-        android.util.Log.d("TssRepository", "Starting TSS sign with cleanMessageHash=${cleanMessageHash.take(20)}...")
+        // CRITICAL: Use shareEntity.partyId (original partyId from keygen) for signing
+        // This is required for backup/restore to work - the partyId must match what was used during keygen
+        val signingPartyId = shareEntity.partyId
+        currentSigningPartyId = signingPartyId // Save for later use in this flow
+        android.util.Log.d("TssRepository", "Starting TSS sign with cleanMessageHash=${cleanMessageHash.take(20)}..., signingPartyId=$signingPartyId")
         val startResult = tssNativeBridge.startSign(
             sessionId = sessionId,
-            partyId = partyId,
+            partyId = signingPartyId,
             partyIndex = partyIndex,
             thresholdT = thresholdT,
             thresholdN = shareEntity.thresholdN, // Use original N from keygen
@@ -1243,8 +1271,8 @@ class TssRepository @Inject constructor(
         // Start collecting progress from native bridge
         startProgressCollection()
 
-        // Mark ready
-        grpcClient.markPartyReady(sessionId, partyId)
+        // Mark ready - use signingPartyId (original partyId from keygen)
+        grpcClient.markPartyReady(sessionId, signingPartyId)
 
         // Wait for sign result
         val signResult = tssNativeBridge.waitForSignResult()
@@ -1256,14 +1284,15 @@ class TssRepository @Inject constructor(
         val result = signResult.getOrThrow()
 
-        // Report completion
+        // Report completion - use signingPartyId (original partyId from keygen)
         val signatureBytes = android.util.Base64.decode(result.signature, android.util.Base64.NO_WRAP)
-        grpcClient.reportCompletion(sessionId, partyId, signature = signatureBytes)
+        grpcClient.reportCompletion(sessionId, signingPartyId, signature = signatureBytes)
 
         stopProgressCollection()
         _sessionStatus.value = SessionStatus.COMPLETED
         pendingSessionId = null // Clear pending session ID on completion
         messageCollectionJob?.cancel()
+        currentSigningPartyId = null // Clear after signing completes
 
         android.util.Log.d("TssRepository", "Sign as joiner completed: signature=${result.signature.take(20)}...")
@@ -1274,6 +1303,7 @@ class TssRepository @Inject constructor(
             stopProgressCollection()
             _sessionStatus.value = SessionStatus.FAILED
             pendingSessionId = null // Clear pending session ID on failure
+            currentSigningPartyId = null // Clear on failure too
             Result.failure(e)
         }
     }
@@ -1366,6 +1396,7 @@ class TssRepository @Inject constructor(
         val address = AddressUtils.deriveEvmAddress(publicKeyBytes)
 
         // Save share record
+        // CRITICAL: Save partyId - this is required for signing after backup/restore
         val shareEntity = ShareRecordEntity(
             sessionId = apiJoinData.sessionId,
             publicKey = result.publicKey,
@@ -1373,6 +1404,7 @@
             thresholdT = apiJoinData.thresholdT,
             thresholdN = apiJoinData.thresholdN,
             partyIndex = myPartyIndex,
+            partyId = partyId,
             address = address
         )
         val id = shareRecordDao.insertShare(shareEntity)
@@ -1516,12 +1548,15 @@ class TssRepository @Inject constructor(
         _sessionStatus.value = SessionStatus.WAITING
 
         // Add self to participants
-        val allParticipants = sessionData.participants + Participant(partyId, myPartyIndex)
+        // CRITICAL: Use shareEntity.partyId (original partyId from keygen) for signing
+        val signingPartyId = shareEntity.partyId
+        currentSigningPartyId = signingPartyId // Save for later use in this flow
+        val allParticipants = sessionData.participants + Participant(signingPartyId, myPartyIndex)
 
         // Start TSS sign
         val startResult = tssNativeBridge.startSign(
             sessionId = apiJoinData.sessionId,
-            partyId = partyId,
+            partyId = signingPartyId,
             partyIndex = myPartyIndex,
             thresholdT = apiJoinData.thresholdT,
             thresholdN = shareEntity.thresholdN, // Use original N from keygen
@@ -1540,8 +1575,8 @@ class TssRepository @Inject constructor(
         // Start message routing
         startMessageRouting(apiJoinData.sessionId, myPartyIndex)
 
-        // Mark ready
-        grpcClient.markPartyReady(apiJoinData.sessionId, partyId)
+        // Mark ready - use signingPartyId (original partyId from keygen)
+        grpcClient.markPartyReady(apiJoinData.sessionId, signingPartyId)
 
         // Wait for sign result
         val signResult = tssNativeBridge.waitForSignResult()
@@ -1552,18 +1587,20 @@ class TssRepository @Inject constructor(
         val result = signResult.getOrThrow()
 
-        // Report completion
+        // Report completion - use signingPartyId (original partyId from keygen)
         val signatureBytes = Base64.decode(result.signature, Base64.NO_WRAP)
-        grpcClient.reportCompletion(apiJoinData.sessionId, partyId, signature = signatureBytes)
+        grpcClient.reportCompletion(apiJoinData.sessionId, signingPartyId, signature = signatureBytes)
 
         _sessionStatus.value = SessionStatus.COMPLETED
         messageCollectionJob?.cancel()
+        currentSigningPartyId = null // Clear after signing completes
 
         Result.success(result)
     } catch (e: Exception) {
         android.util.Log.e("TssRepository", "Join sign session failed", e)
         _sessionStatus.value = SessionStatus.FAILED
+        currentSigningPartyId = null // Clear on failure too
         Result.failure(e)
     }
 }
@@ -1785,6 +1822,7 @@ class TssRepository @Inject constructor(
         val address = AddressUtils.deriveEvmAddress(publicKeyBytes)
 
         // Save share record (use actual thresholds from backend)
+        // CRITICAL: Save partyId - this is required for signing after backup/restore
         val shareEntity = ShareRecordEntity(
             sessionId = sessionId,
             publicKey = result.publicKey,
@@ -1792,6 +1830,7 @@
             thresholdT = actualThresholdT,
             thresholdN = actualThresholdN,
             partyIndex = myPartyIndex,
+            partyId = partyId,
             address = address
         )
         val id = shareRecordDao.insertShare(shareEntity)
@@ -1900,6 +1939,7 @@ class TssRepository @Inject constructor(
         }
 
         // Convert to entity and save
+        // CRITICAL: Preserve the original partyId from backup - this is required for signing
         val shareRecord = backup.toShareRecord()
         val entity = ShareRecordEntity(
             sessionId = shareRecord.sessionId,
@@ -1908,6 +1948,7 @@
             thresholdT = shareRecord.thresholdT,
             thresholdN = shareRecord.thresholdN,
             partyIndex = shareRecord.partyIndex,
+            partyId = shareRecord.partyId,
             address = shareRecord.address,
             createdAt = shareRecord.createdAt
         )
@@ -1915,7 +1956,7 @@
         val newId = shareRecordDao.insertShare(entity)
         val savedShare = shareRecord.copy(id = newId)
 
-        android.util.Log.d("TssRepository", "Imported share backup for address: ${backup.address}")
+        android.util.Log.d("TssRepository", "Imported share backup for address: ${backup.address}, partyId: ${backup.partyId}")
         Result.success(savedShare)
     } catch (e: com.google.gson.JsonSyntaxException) {
         android.util.Log.e("TssRepository", "Invalid JSON format in backup", e)
@@ -1971,10 +2012,19 @@ class TssRepository @Inject constructor(
     }
 
     /**
-     * Get Green Points (绿积分/dUSDT) token balance for an address
+     * Get ERC-20 token balance for an address
      * Uses eth_call to call balanceOf(address) on the ERC-20 contract
+     * @param address The wallet address
+     * @param rpcUrl The RPC endpoint URL
+     * @param contractAddress The ERC-20 token contract address
+     * @param decimals The token decimals (default 6 for USDT-like tokens)
      */
-    suspend fun getGreenPointsBalance(address: String, rpcUrl: String): Result<String> {
+    suspend fun getERC20Balance(
+        address: String,
+        rpcUrl: String,
+        contractAddress: String,
+        decimals: Int = 6
+    ): Result<String> {
         return withContext(Dispatchers.IO) {
             try {
                 val client = okhttp3.OkHttpClient()
@@ -1984,14 +2034,14 @@ class TssRepository @Inject constructor(
                 // Function selector: 0x70a08231
                 // Address parameter: padded to 32 bytes
                 val paddedAddress = address.removePrefix("0x").lowercase().padStart(64, '0')
-                val callData = "${GreenPointsToken.BALANCE_OF_SELECTOR}$paddedAddress"
+                val callData = "${ERC20Selectors.BALANCE_OF}$paddedAddress"
 
                 val requestBody = """
                     {
                         "jsonrpc": "2.0",
                         "method": "eth_call",
                         "params": [{
-                            "to": "${GreenPointsToken.CONTRACT_ADDRESS}",
+                            "to": "$contractAddress",
                             "data": "$callData"
                         }, "latest"],
                         "id": 1
@@ -2013,42 +2063,88 @@ class TssRepository @Inject constructor(
                 }
 
                 val hexBalance = json.get("result").asString
-                // Convert hex to decimal, then apply 6 decimals (dUSDT uses 6 decimals like USDT)
+                // Convert hex to decimal, then apply decimals
                 val rawBalance = java.math.BigInteger(hexBalance.removePrefix("0x"), 16)
+                val divisor = java.math.BigDecimal.TEN.pow(decimals)
                 val tokenBalance = java.math.BigDecimal(rawBalance).divide(
-                    java.math.BigDecimal("1000000"), // 10^6 for 6 decimals
-                    6,
+                    divisor,
+                    decimals,
                     java.math.RoundingMode.DOWN
                 )
 
                 Result.success(tokenBalance.toPlainString())
             } catch (e: Exception) {
-                android.util.Log.e("TssRepository", "Failed to get Green Points balance: ${e.message}")
+                android.util.Log.e("TssRepository", "Failed to get ERC20 balance for $contractAddress: ${e.message}")
                 Result.failure(e)
             }
         }
     }
 
     /**
-     * Get both KAVA and Green Points balances for an address
+     * Get Green Points (绿积分/dUSDT) token balance for an address
+     * Uses eth_call to call balanceOf(address) on the ERC-20 contract
+     */
+    suspend fun getGreenPointsBalance(address: String, rpcUrl: String): Result<String> {
+        return getERC20Balance(
+            address = address,
+            rpcUrl = rpcUrl,
+            contractAddress = GreenPointsToken.CONTRACT_ADDRESS,
+            decimals = GreenPointsToken.DECIMALS
+        )
+    }
+
+    /**
+     * Get Energy Points (积分股/eUSDT) token balance for an address
+     */
+    suspend fun getEnergyPointsBalance(address: String, rpcUrl: String): Result<String> {
+        return getERC20Balance(
+            address = address,
+            rpcUrl = rpcUrl,
+            contractAddress = EnergyPointsToken.CONTRACT_ADDRESS,
+            decimals = EnergyPointsToken.DECIMALS
+        )
+    }
+
+    /**
+     * Get Future Points (积分值/fUSDT) token balance for an address
+     */
+    suspend fun getFuturePointsBalance(address: String, rpcUrl: String): Result<String> {
+        return getERC20Balance(
+            address = address,
+            rpcUrl = rpcUrl,
+            contractAddress = FuturePointsToken.CONTRACT_ADDRESS,
+            decimals = FuturePointsToken.DECIMALS
+        )
+    }
+
+    /**
+     * Get all token balances for an address (KAVA + all ERC-20 tokens)
      */
     suspend fun getWalletBalance(address: String, rpcUrl: String): Result<WalletBalance> {
         return withContext(Dispatchers.IO) {
             try {
-                // Fetch both balances in parallel
+                // Fetch all balances in parallel
                 val kavaDeferred = async { getBalance(address, rpcUrl) }
                 val greenPointsDeferred = async { getGreenPointsBalance(address, rpcUrl) }
+                val energyPointsDeferred = async { getEnergyPointsBalance(address, rpcUrl) }
+                val futurePointsDeferred = async { getFuturePointsBalance(address, rpcUrl) }
 
                 val kavaResult = kavaDeferred.await()
                 val greenPointsResult = greenPointsDeferred.await()
+                val energyPointsResult = energyPointsDeferred.await()
+                val futurePointsResult = futurePointsDeferred.await()
 
                 val kavaBalance = kavaResult.getOrDefault("0")
                 val greenPointsBalance = greenPointsResult.getOrDefault("0")
+                val energyPointsBalance = energyPointsResult.getOrDefault("0")
+                val futurePointsBalance = futurePointsResult.getOrDefault("0")
 
                 Result.success(WalletBalance(
                     address = address,
                     kavaBalance = kavaBalance,
-                    greenPointsBalance = greenPointsBalance
+                    greenPointsBalance = greenPointsBalance,
+                    energyPointsBalance = energyPointsBalance,
+                    futurePointsBalance = futurePointsBalance
                 ))
             } catch (e: Exception) {
                 Result.failure(e)
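The `eth_call` path above reduces to two pure transformations: building the `balanceOf(address)` calldata (the 4-byte selector `0x70a08231` followed by the address left-padded to 32 bytes) and scaling the returned raw integer by `10^decimals`. A self-contained JVM sketch of both steps, mirroring the Kotlin logic (class and method names are illustrative):

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.math.RoundingMode;

public class Erc20BalanceSketch {
    static final String BALANCE_OF_SELECTOR = "0x70a08231";

    // selector + address left-padded to 32 bytes (64 hex chars)
    static String buildBalanceOfCallData(String address) {
        String hex = address.toLowerCase().replaceFirst("^0x", "");
        String padded = "0".repeat(64 - hex.length()) + hex;
        return BALANCE_OF_SELECTOR + padded;
    }

    // raw integer balance -> human-readable string with the token's decimals
    static String toHumanReadable(String hexBalance, int decimals) {
        BigInteger raw = new BigInteger(hexBalance.replaceFirst("^0x", ""), 16);
        return new BigDecimal(raw)
                .divide(BigDecimal.TEN.pow(decimals), decimals, RoundingMode.DOWN)
                .toPlainString();
    }

    public static void main(String[] args) {
        String callData = buildBalanceOfCallData("0x52908400098527886E0F7030069857D2E4169EE7");
        System.out.println(callData.length());           // 74 (10-char selector + 64 hex chars)
        System.out.println(toHumanReadable("0xf4240", 6)); // 1.000000 (1,000,000 raw units at 6 decimals)
    }
}
```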
@@ -2312,8 +2408,12 @@ class TssRepository @Inject constructor(
         val shareEntity = shareRecordDao.getShareById(shareId)
             ?: return@withContext Result.failure(Exception("Share not found"))
 
+        // CRITICAL: Use shareEntity.partyId (original partyId from keygen) for signing
+        // This is required for backup/restore to work - the partyId must match what was used during keygen
+        val signingPartyId = shareEntity.partyId
+        currentSigningPartyId = signingPartyId // Save for waitForSignature
         android.util.Log.d("TssRepository", "[CO-SIGN] startSigning: participants=${session.participants.size}")
-        android.util.Log.d("TssRepository", "[CO-SIGN] startSigning: sessionId=$sessionId, partyId=$partyId, partyIndex=${shareEntity.partyIndex}")
+        android.util.Log.d("TssRepository", "[CO-SIGN] startSigning: sessionId=$sessionId, signingPartyId=$signingPartyId, partyIndex=${shareEntity.partyIndex}")
         android.util.Log.d("TssRepository", "[CO-SIGN] startSigning: thresholdT=${session.thresholdT}, thresholdN=${shareEntity.thresholdN}")
         android.util.Log.d("TssRepository", "[CO-SIGN] startSigning: messageHash=${session.messageHash?.take(20)}...")
         session.participants.forEachIndexed { idx, p ->
@@ -2328,10 +2428,10 @@ class TssRepository @Inject constructor(
         } else {
             rawMessageHash
         }
-        android.util.Log.d("TssRepository", "[CO-SIGN] Calling tssNativeBridge.startSign with cleanMessageHash=${cleanMessageHash.take(20)}...")
+        android.util.Log.d("TssRepository", "[CO-SIGN] Calling tssNativeBridge.startSign with cleanMessageHash=${cleanMessageHash.take(20)}..., signingPartyId=$signingPartyId")
         val startResult = tssNativeBridge.startSign(
             sessionId = sessionId,
-            partyId = partyId,
+            partyId = signingPartyId,
             partyIndex = shareEntity.partyIndex,
             thresholdT = session.thresholdT,
             thresholdN = shareEntity.thresholdN,
@@ -2359,8 +2459,8 @@ class TssRepository @Inject constructor(
             startMessageRouting(sessionId, shareEntity.partyIndex)
         }
 
-        // Mark ready
-        grpcClient.markPartyReady(sessionId, partyId)
+        // Mark ready - use signingPartyId (original partyId from keygen)
+        grpcClient.markPartyReady(sessionId, signingPartyId)
 
         Result.success(Unit)
     } catch (e: Exception) {
@@ -2386,16 +2486,18 @@ class TssRepository @Inject constructor(
         val result = signResult.getOrThrow()
 
-        // Report completion
+        // Report completion - use currentSigningPartyId (original partyId from keygen)
         val signatureBytes = Base64.decode(result.signature, Base64.NO_WRAP)
         val session = _currentSession.value
+        val signingPartyId = currentSigningPartyId ?: partyId
         if (session != null) {
-            grpcClient.reportCompletion(session.sessionId, partyId, signature = signatureBytes)
+            grpcClient.reportCompletion(session.sessionId, signingPartyId, signature = signatureBytes)
         }
 
         stopProgressCollection()
         _sessionStatus.value = SessionStatus.COMPLETED
         messageCollectionJob?.cancel()
+        currentSigningPartyId = null // Clear after signing completes
 
         Result.success(result)
     } catch (e: Exception) {
@@ -2759,6 +2861,7 @@ private fun ShareRecordEntity.toShareRecord() = ShareRecord(
     thresholdT = thresholdT,
     thresholdN = thresholdN,
     partyIndex = partyIndex,
+    partyId = partyId,
     address = address,
     createdAt = createdAt
 )


@@ -6,6 +6,7 @@ import androidx.room.migration.Migration
 import androidx.sqlite.db.SupportSQLiteDatabase
 import com.durian.tssparty.data.local.AppSettingDao
 import com.durian.tssparty.data.local.ShareRecordDao
+import com.durian.tssparty.data.local.TransactionRecordDao
 import com.durian.tssparty.data.local.TssDatabase
 import com.durian.tssparty.data.local.TssNativeBridge
 import com.durian.tssparty.data.remote.GrpcClient
@ -34,6 +35,53 @@ object AppModule {
} }
} }
// Migration from version 2 to 3: add party_id column to share_records
// This is critical for backup/restore - the partyId must be preserved for signing to work
private val MIGRATION_2_3 = object : Migration(2, 3) {
override fun migrate(database: SupportSQLiteDatabase) {
// Add party_id column with empty default (existing records will need to be re-exported)
database.execSQL(
"ALTER TABLE `share_records` ADD COLUMN `party_id` TEXT NOT NULL DEFAULT ''"
)
}
}
// Migration from version 3 to 4: add transaction_records table for transfer history
// 添加转账记录表,用于存储交易历史和分类账
private val MIGRATION_3_4 = object : Migration(3, 4) {
override fun migrate(database: SupportSQLiteDatabase) {
// Create the transaction records table
database.execSQL("""
CREATE TABLE IF NOT EXISTS `transaction_records` (
`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
`share_id` INTEGER NOT NULL,
`from_address` TEXT NOT NULL,
`to_address` TEXT NOT NULL,
`amount` TEXT NOT NULL,
`token_type` TEXT NOT NULL,
`tx_hash` TEXT NOT NULL,
`gas_price` TEXT NOT NULL,
`gas_used` TEXT NOT NULL DEFAULT '',
`tx_fee` TEXT NOT NULL DEFAULT '',
`status` TEXT NOT NULL,
`direction` TEXT NOT NULL,
`note` TEXT NOT NULL DEFAULT '',
`created_at` INTEGER NOT NULL,
`confirmed_at` INTEGER,
`block_number` INTEGER,
FOREIGN KEY(`share_id`) REFERENCES `share_records`(`id`) ON DELETE CASCADE
)
""".trimIndent())
// Create indexes to optimize query performance
database.execSQL("CREATE INDEX IF NOT EXISTS `index_transaction_records_share_id` ON `transaction_records` (`share_id`)")
database.execSQL("CREATE UNIQUE INDEX IF NOT EXISTS `index_transaction_records_tx_hash` ON `transaction_records` (`tx_hash`)")
database.execSQL("CREATE INDEX IF NOT EXISTS `index_transaction_records_from_address` ON `transaction_records` (`from_address`)")
database.execSQL("CREATE INDEX IF NOT EXISTS `index_transaction_records_to_address` ON `transaction_records` (`to_address`)")
database.execSQL("CREATE INDEX IF NOT EXISTS `index_transaction_records_created_at` ON `transaction_records` (`created_at`)")
}
}
@Provides
@Singleton
fun provideGson(): Gson {
@@ -48,7 +96,7 @@ object AppModule {
TssDatabase::class.java,
"tss_party.db"
)
.addMigrations(MIGRATION_1_2, MIGRATION_2_3, MIGRATION_3_4)
.build()
}
@@ -64,6 +112,12 @@ object AppModule {
return database.appSettingDao()
}
@Provides
@Singleton
fun provideTransactionRecordDao(database: TssDatabase): TransactionRecordDao {
return database.transactionRecordDao()
}
@Provides
@Singleton
fun provideGrpcClient(): GrpcClient {
@@ -82,8 +136,9 @@
grpcClient: GrpcClient,
tssNativeBridge: TssNativeBridge,
shareRecordDao: ShareRecordDao,
appSettingDao: AppSettingDao,
transactionRecordDao: TransactionRecordDao
): TssRepository {
return TssRepository(grpcClient, tssNativeBridge, shareRecordDao, appSettingDao, transactionRecordDao)
}
}
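Room resolves a migration path by chaining the registered steps from the installed schema version to the latest one, which is why `addMigrations(MIGRATION_1_2, MIGRATION_2_3, MIGRATION_3_4)` lets a version-1 install upgrade straight to version 4. The sketch below is plain TypeScript, not Room — an illustration of that chaining under the assumption that each step is keyed by its start version:

```typescript
// Illustrative sketch (not Room itself): resolving registered
// migrations into an ordered upgrade path, the way
// addMigrations(MIGRATION_1_2, MIGRATION_2_3, MIGRATION_3_4) behaves.
interface Migration {
  start: number;
  end: number;
  migrate: () => void;
}

function resolveMigrationPath(
  migrations: Migration[],
  from: number,
  to: number
): Migration[] {
  const path: Migration[] = [];
  let current = from;
  while (current < to) {
    const step = migrations.find((m) => m.start === current);
    if (!step) {
      // No registered step: Room would throw (or fall back to a
      // destructive migration if configured to do so).
      throw new Error(`No migration from version ${current}`);
    }
    path.push(step);
    current = step.end;
  }
  return path;
}

const executed: string[] = [];
const migrations: Migration[] = [
  { start: 1, end: 2, migrate: () => executed.push("1->2") },
  { start: 2, end: 3, migrate: () => executed.push("2->3 (party_id)") },
  { start: 3, end: 4, migrate: () => executed.push("3->4 (transaction_records)") },
];

for (const m of resolveMigrationPath(migrations, 1, 4)) m.migrate();
```

A device already on version 3 would resolve a one-step path (`3->4`) and run only the `transaction_records` migration.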

View File

@@ -86,6 +86,7 @@ data class ShareRecord(
val thresholdT: Int,
val thresholdN: Int,
val partyIndex: Int,
val partyId: String, // The original partyId used during keygen - required for signing
val address: String, val address: String,
val createdAt: Long = System.currentTimeMillis() val createdAt: Long = System.currentTimeMillis()
) )
@ -129,7 +130,21 @@ enum class NetworkType {
*/ */
enum class TokenType { enum class TokenType {
KAVA, // Native KAVA token KAVA, // Native KAVA token
GREEN_POINTS // 绿积分 (dUSDT) ERC-20 token GREEN_POINTS, // 绿积分 (dUSDT) ERC-20 token
ENERGY_POINTS, // 积分股 (eUSDT) ERC-20 token
FUTURE_POINTS // 积分值 (fUSDT) ERC-20 token
}
/**
* Common ERC-20 function selectors: the first 4 bytes of the keccak256 hash of each function signature
*/
object ERC20Selectors {
const val BALANCE_OF = "0x70a08231" // balanceOf(address)
const val TRANSFER = "0xa9059cbb" // transfer(address,uint256)
const val APPROVE = "0x095ea7b3" // approve(address,uint256)
const val ALLOWANCE = "0xdd62ed3e" // allowance(address,address)
const val TOTAL_SUPPLY = "0x18160ddd" // totalSupply()
}
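A selector is only the first 4 bytes of the call; the full calldata appends each argument left-padded to a 32-byte word. A minimal sketch of that assembly in TypeScript, using the `balanceOf`/`transfer` selectors above (the address and amount are illustrative, not real values):

```typescript
// Sketch: turning 4-byte selectors into eth_call / transaction
// calldata by appending 32-byte-padded arguments.
function pad32(hexNoPrefix: string): string {
  return hexNoPrefix.toLowerCase().padStart(64, "0");
}

// balanceOf(address): selector 0x70a08231 + padded holder address
function encodeBalanceOf(holder: string): string {
  return "0x70a08231" + pad32(holder.replace(/^0x/, ""));
}

// transfer(address,uint256): selector 0xa9059cbb + padded recipient + padded amount
function encodeTransfer(to: string, amount: bigint): string {
  return (
    "0xa9059cbb" +
    pad32(to.replace(/^0x/, "")) +
    pad32(amount.toString(16))
  );
}

const data = encodeTransfer(
  "0x14dc4f7d3E4197438d058C3D156dd9826A161134",
  1_000_000n // 1.0 of a 6-decimal token, in raw units
);
// "0x" + 8 hex chars (selector) + 2 * 64 hex chars (two words)
console.log(data.length); // 138
```

This is the same layout `encodeErc20Transfer` in `TransactionUtils` produces as a `ByteArray`.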
/**
@@ -142,22 +157,122 @@ object GreenPointsToken {
const val SYMBOL = "dUSDT"
const val DECIMALS = 6
// ERC-20 function signatures (kept for backward compatibility)
const val BALANCE_OF_SELECTOR = ERC20Selectors.BALANCE_OF
const val TRANSFER_SELECTOR = ERC20Selectors.TRANSFER
const val APPROVE_SELECTOR = ERC20Selectors.APPROVE
const val ALLOWANCE_SELECTOR = ERC20Selectors.ALLOWANCE
const val TOTAL_SUPPLY_SELECTOR = ERC20Selectors.TOTAL_SUPPLY
}
/**
* Energy Points (积分股) Token Contract Configuration
* eUSDT - ERC-20 token on Kava EVM
* Total supply: 10.002 billion (10,002,000,000)
*/
object EnergyPointsToken {
const val CONTRACT_ADDRESS = "0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931"
const val NAME = "积分股"
const val SYMBOL = "eUSDT"
const val DECIMALS = 6 // same precision as dUSDT
}
/**
* Future Points (积分值) Token Contract Configuration
* fUSDT - ERC-20 token on Kava EVM
* Total supply: 1 trillion (1,000,000,000,000)
*/
object FuturePointsToken {
const val CONTRACT_ADDRESS = "0x14dc4f7d3E4197438d058C3D156dd9826A161134"
const val NAME = "积分值"
const val SYMBOL = "fUSDT"
const val DECIMALS = 6 // same precision as dUSDT
}
/**
* Token configuration utility
*/
object TokenConfig {
/**
* Returns the token's contract address
*/
fun getContractAddress(tokenType: TokenType): String? {
return when (tokenType) {
TokenType.KAVA -> null // native token has no contract address
TokenType.GREEN_POINTS -> GreenPointsToken.CONTRACT_ADDRESS
TokenType.ENERGY_POINTS -> EnergyPointsToken.CONTRACT_ADDRESS
TokenType.FUTURE_POINTS -> FuturePointsToken.CONTRACT_ADDRESS
}
}
/**
* Returns the token's decimals
*/
fun getDecimals(tokenType: TokenType): Int {
return when (tokenType) {
TokenType.KAVA -> 18 // native KAVA precision
TokenType.GREEN_POINTS -> GreenPointsToken.DECIMALS
TokenType.ENERGY_POINTS -> EnergyPointsToken.DECIMALS
TokenType.FUTURE_POINTS -> FuturePointsToken.DECIMALS
}
}
/**
* Returns the token's display name
*/
fun getName(tokenType: TokenType): String {
return when (tokenType) {
TokenType.KAVA -> "KAVA"
TokenType.GREEN_POINTS -> GreenPointsToken.NAME
TokenType.ENERGY_POINTS -> EnergyPointsToken.NAME
TokenType.FUTURE_POINTS -> FuturePointsToken.NAME
}
}
/**
* Returns the token's symbol
*/
fun getSymbol(tokenType: TokenType): String {
return when (tokenType) {
TokenType.KAVA -> "KAVA"
TokenType.GREEN_POINTS -> GreenPointsToken.SYMBOL
TokenType.ENERGY_POINTS -> EnergyPointsToken.SYMBOL
TokenType.FUTURE_POINTS -> FuturePointsToken.SYMBOL
}
}
/**
* Returns true if the token type is an ERC-20 token
*/
fun isERC20(tokenType: TokenType): Boolean {
return tokenType != TokenType.KAVA
}
}
/**
* Wallet balance containing native and all token balances
*/
data class WalletBalance(
val address: String,
val kavaBalance: String = "0", // Native KAVA balance
val greenPointsBalance: String = "0", // 绿积分 (dUSDT) balance
val energyPointsBalance: String = "0", // 积分股 (eUSDT) balance
val futurePointsBalance: String = "0" // 积分值 (fUSDT) balance
) {
/**
* Returns the balance for the given token type
*/
fun getBalance(tokenType: TokenType): String {
return when (tokenType) {
TokenType.KAVA -> kavaBalance
TokenType.GREEN_POINTS -> greenPointsBalance
TokenType.ENERGY_POINTS -> energyPointsBalance
TokenType.FUTURE_POINTS -> futurePointsBalance
}
}
}
/**
* Share backup data for export/import
@@ -165,7 +280,7 @@ data class WalletBalance(
*/
data class ShareBackup(
@SerializedName("version")
val version: Int = 2, // Version 2: added partyId field for proper backup/restore
@SerializedName("sessionId")
val sessionId: String,
@@ -185,6 +300,9 @@
@SerializedName("partyIndex")
val partyIndex: Int,
@SerializedName("partyId")
val partyId: String, // The original partyId used during keygen - CRITICAL for signing after restore
@SerializedName("address")
val address: String,
@@ -209,6 +327,7 @@
thresholdT = share.thresholdT,
thresholdN = share.thresholdN,
partyIndex = share.partyIndex,
partyId = share.partyId,
address = share.address,
createdAt = share.createdAt
)
@@ -227,6 +346,7 @@
thresholdT = thresholdT,
thresholdN = thresholdN,
partyIndex = partyIndex,
partyId = partyId,
address = address,
createdAt = createdAt
)
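Because version-1 backups predate the `partyId` field, any restore path has to branch on `version` and treat a missing `partyId` as "signing identity unknown". The sketch below is a hypothetical restorer, not the app's actual import code — the field names mirror the `ShareBackup` data class, but the flagging logic is an assumption:

```typescript
// Hypothetical version-aware restore for the ShareBackup JSON format.
// Version 1 backups have no partyId, so the restored share is flagged
// for re-export rather than silently given the current device's id.
interface RestoredShare {
  version: number;
  partyId: string | null; // null => legacy backup, signing identity unknown
  needsReexport: boolean;
}

function restoreBackup(json: string): RestoredShare {
  const raw = JSON.parse(json);
  const version = typeof raw.version === "number" ? raw.version : 1;
  const partyId = version >= 2 && raw.partyId ? raw.partyId : null;
  return { version, partyId, needsReexport: partyId === null };
}

const v1 = restoreBackup('{"version":1,"sessionId":"s1","partyIndex":0}');
const v2 = restoreBackup(
  '{"version":2,"sessionId":"s2","partyIndex":0,"partyId":"party-abc"}'
);
```

Falling back to the current device's party ID instead (as the Electron initiator does with `share.party_id || currentDevicePartyId`) is the other option, but it only works when the restoring device is the one that ran keygen.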

View File

@@ -27,10 +27,13 @@ import android.graphics.Bitmap
import androidx.compose.foundation.Image
import androidx.compose.foundation.background
import androidx.compose.ui.graphics.asImageBitmap
import com.durian.tssparty.domain.model.EnergyPointsToken
import com.durian.tssparty.domain.model.FuturePointsToken
import com.durian.tssparty.domain.model.GreenPointsToken
import com.durian.tssparty.domain.model.NetworkType
import com.durian.tssparty.domain.model.SessionStatus
import com.durian.tssparty.domain.model.ShareRecord
import com.durian.tssparty.domain.model.TokenConfig
import com.durian.tssparty.domain.model.TokenType
import com.durian.tssparty.domain.model.WalletBalance
import com.durian.tssparty.util.TransactionUtils
@@ -156,10 +159,8 @@ fun TransferScreen(
rpcUrl = rpcUrl,
onSubmit = {
// Get current balance for the selected token type
val currentBalance = walletBalance?.getBalance(selectedTokenType)
    ?: if (selectedTokenType == TokenType.KAVA) balance else null
when {
toAddress.isBlank() -> validationError = "请输入收款地址"
!toAddress.startsWith("0x") || toAddress.length != 42 -> validationError = "地址格式不正确"
@@ -257,14 +258,9 @@ private fun TransferInputScreen(
var isCalculatingMax by remember { mutableStateOf(false) }
// Get current balance for the selected token type
val currentBalance = walletBalance?.getBalance(selectedTokenType)
    ?: if (selectedTokenType == TokenType.KAVA) balance else null
val tokenSymbol = TokenConfig.getName(selectedTokenType)
Column(
modifier = Modifier
@@ -293,38 +289,74 @@
)
Spacer(modifier = Modifier.height(8.dp))
// Show all token balances in a 2x2 grid
Column {
    Row(
        modifier = Modifier.fillMaxWidth(),
        horizontalArrangement = Arrangement.SpaceBetween
    ) {
        // KAVA balance
        Column {
            Text(
                text = "KAVA",
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.onSurfaceVariant
            )
            Text(
                text = walletBalance?.kavaBalance ?: balance ?: "加载中...",
                style = MaterialTheme.typography.bodySmall,
                fontWeight = FontWeight.Medium,
                color = MaterialTheme.colorScheme.primary
            )
        }
        // Green Points balance (绿积分)
        Column(horizontalAlignment = Alignment.End) {
            Text(
                text = GreenPointsToken.NAME,
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.onSurfaceVariant
            )
            Text(
                text = walletBalance?.greenPointsBalance ?: "加载中...",
                style = MaterialTheme.typography.bodySmall,
                fontWeight = FontWeight.Medium,
                color = Color(0xFF4CAF50)
            )
        }
    }
    Spacer(modifier = Modifier.height(4.dp))
    Row(
        modifier = Modifier.fillMaxWidth(),
        horizontalArrangement = Arrangement.SpaceBetween
    ) {
        // Energy Points balance (积分股)
        Column {
            Text(
                text = EnergyPointsToken.NAME,
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.onSurfaceVariant
            )
            Text(
                text = walletBalance?.energyPointsBalance ?: "加载中...",
                style = MaterialTheme.typography.bodySmall,
                fontWeight = FontWeight.Medium,
                color = Color(0xFF2196F3) // Blue
            )
        }
        // Future Points balance (积分值)
        Column(horizontalAlignment = Alignment.End) {
            Text(
                text = FuturePointsToken.NAME,
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.onSurfaceVariant
            )
            Text(
                text = walletBalance?.futurePointsBalance ?: "加载中...",
                style = MaterialTheme.typography.bodySmall,
                fontWeight = FontWeight.Medium,
                color = Color(0xFF9C27B0) // Purple
            )
        }
    }
}
}
@@ -339,6 +371,7 @@
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(8.dp))
// First row: KAVA and Green Points
Row(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.spacedBy(8.dp)
@@ -359,7 +392,7 @@
},
modifier = Modifier.weight(1f)
)
// Green Points option (绿积分)
FilterChip(
selected = selectedTokenType == TokenType.GREEN_POINTS,
onClick = { onTokenTypeChange(TokenType.GREEN_POINTS) },
@@ -380,6 +413,53 @@
modifier = Modifier.weight(1f)
)
}
Spacer(modifier = Modifier.height(8.dp))
// Second row: Energy Points and Future Points
Row(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.spacedBy(8.dp)
) {
// Energy Points option (积分股)
FilterChip(
selected = selectedTokenType == TokenType.ENERGY_POINTS,
onClick = { onTokenTypeChange(TokenType.ENERGY_POINTS) },
label = { Text(EnergyPointsToken.NAME) },
leadingIcon = {
if (selectedTokenType == TokenType.ENERGY_POINTS) {
Icon(
Icons.Default.Check,
contentDescription = null,
modifier = Modifier.size(18.dp)
)
}
},
colors = FilterChipDefaults.filterChipColors(
selectedContainerColor = Color(0xFF2196F3).copy(alpha = 0.2f),
selectedLabelColor = Color(0xFF2196F3)
),
modifier = Modifier.weight(1f)
)
// Future Points option (积分值)
FilterChip(
selected = selectedTokenType == TokenType.FUTURE_POINTS,
onClick = { onTokenTypeChange(TokenType.FUTURE_POINTS) },
label = { Text(FuturePointsToken.NAME) },
leadingIcon = {
if (selectedTokenType == TokenType.FUTURE_POINTS) {
Icon(
Icons.Default.Check,
contentDescription = null,
modifier = Modifier.size(18.dp)
)
}
},
colors = FilterChipDefaults.filterChipColors(
selectedContainerColor = Color(0xFF9C27B0).copy(alpha = 0.2f),
selectedLabelColor = Color(0xFF9C27B0)
),
modifier = Modifier.weight(1f)
)
}
Spacer(modifier = Modifier.height(16.dp))
@@ -418,9 +498,14 @@
keyboardOptions = KeyboardOptions(keyboardType = KeyboardType.Decimal),
leadingIcon = {
Icon(
if (selectedTokenType == TokenType.KAVA) Icons.Default.AttachMoney else Icons.Default.Stars,
contentDescription = null,
tint = when (selectedTokenType) {
    TokenType.KAVA -> MaterialTheme.colorScheme.onSurfaceVariant
    TokenType.GREEN_POINTS -> Color(0xFF4CAF50)
    TokenType.ENERGY_POINTS -> Color(0xFF2196F3)
    TokenType.FUTURE_POINTS -> Color(0xFF9C27B0)
}
)
},
trailingIcon = {
@@ -439,7 +524,7 @@
onAmountChange(currentBalance)
}
} else {
// For ERC-20 tokens (dUSDT, eUSDT, fUSDT), use the full balance
onAmountChange(currentBalance)
}
isCalculatingMax = false

View File

@@ -35,6 +35,8 @@ import androidx.compose.ui.unit.sp
import androidx.compose.ui.window.Dialog
import android.content.Intent
import android.net.Uri
import com.durian.tssparty.domain.model.EnergyPointsToken
import com.durian.tssparty.domain.model.FuturePointsToken
import com.durian.tssparty.domain.model.GreenPointsToken
import com.durian.tssparty.domain.model.NetworkType
import com.durian.tssparty.domain.model.ShareRecord
@@ -281,62 +283,123 @@ private fun WalletItemCard(
Spacer(modifier = Modifier.height(12.dp))
// Balance display - shows all token balances in a 2x2 grid
Column {
    Row(
        modifier = Modifier.fillMaxWidth(),
        horizontalArrangement = Arrangement.SpaceBetween
    ) {
        // KAVA balance
        Column {
            Text(
                text = "KAVA",
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.outline
            )
            Row(verticalAlignment = Alignment.CenterVertically) {
                Icon(
                    Icons.Default.AccountBalance,
                    contentDescription = null,
                    modifier = Modifier.size(16.dp),
                    tint = MaterialTheme.colorScheme.primary
                )
                Spacer(modifier = Modifier.width(4.dp))
                Text(
                    text = walletBalance?.kavaBalance ?: balance ?: "加载中...",
                    style = MaterialTheme.typography.bodyMedium,
                    color = if (walletBalance != null || balance != null)
                        MaterialTheme.colorScheme.primary
                    else
                        MaterialTheme.colorScheme.outline,
                    fontWeight = FontWeight.Medium
                )
            }
        }
        // Green Points (绿积分) balance
        Column(horizontalAlignment = Alignment.End) {
            Text(
                text = GreenPointsToken.NAME,
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.outline
            )
            Row(verticalAlignment = Alignment.CenterVertically) {
                Icon(
                    Icons.Default.Stars,
                    contentDescription = null,
                    modifier = Modifier.size(16.dp),
                    tint = Color(0xFF4CAF50)
                )
                Spacer(modifier = Modifier.width(4.dp))
                Text(
                    text = walletBalance?.greenPointsBalance ?: "加载中...",
                    style = MaterialTheme.typography.bodyMedium,
                    color = if (walletBalance != null)
                        Color(0xFF4CAF50)
                    else
                        MaterialTheme.colorScheme.outline,
                    fontWeight = FontWeight.Medium
                )
            }
        }
    }
    Spacer(modifier = Modifier.height(8.dp))
    Row(
        modifier = Modifier.fillMaxWidth(),
        horizontalArrangement = Arrangement.SpaceBetween
    ) {
        // Energy Points (积分股) balance
        Column {
            Text(
                text = EnergyPointsToken.NAME,
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.outline
            )
            Row(verticalAlignment = Alignment.CenterVertically) {
                Icon(
                    Icons.Default.Stars,
                    contentDescription = null,
                    modifier = Modifier.size(16.dp),
                    tint = Color(0xFF2196F3) // Blue
                )
                Spacer(modifier = Modifier.width(4.dp))
                Text(
                    text = walletBalance?.energyPointsBalance ?: "加载中...",
                    style = MaterialTheme.typography.bodyMedium,
                    color = if (walletBalance != null)
                        Color(0xFF2196F3)
                    else
                        MaterialTheme.colorScheme.outline,
                    fontWeight = FontWeight.Medium
                )
            }
        }
        // Future Points (积分值) balance
        Column(horizontalAlignment = Alignment.End) {
            Text(
                text = FuturePointsToken.NAME,
                style = MaterialTheme.typography.labelSmall,
                color = MaterialTheme.colorScheme.outline
            )
            Row(verticalAlignment = Alignment.CenterVertically) {
                Icon(
                    Icons.Default.Stars,
                    contentDescription = null,
                    modifier = Modifier.size(16.dp),
                    tint = Color(0xFF9C27B0) // Purple
                )
                Spacer(modifier = Modifier.width(4.dp))
                Text(
                    text = walletBalance?.futurePointsBalance ?: "加载中...",
                    style = MaterialTheme.typography.bodyMedium,
                    color = if (walletBalance != null)
                        Color(0xFF9C27B0)
                    else
                        MaterialTheme.colorScheme.outline,
                    fontWeight = FontWeight.Medium
                )
            }
        }
    }
}
}
}
}

View File

@@ -1,6 +1,10 @@
package com.durian.tssparty.util
import com.durian.tssparty.domain.model.ERC20Selectors
import com.durian.tssparty.domain.model.EnergyPointsToken
import com.durian.tssparty.domain.model.FuturePointsToken
import com.durian.tssparty.domain.model.GreenPointsToken
import com.durian.tssparty.domain.model.TokenConfig
import com.durian.tssparty.domain.model.TokenType
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext
@@ -61,7 +65,7 @@
/**
* Prepare a transaction for signing
* Gets nonce, gas price, estimates gas, and calculates sign hash
* Supports both native KAVA transfers and ERC-20 token transfers (绿积分/积分股/积分值)
*/
suspend fun prepareTransaction(params: TransactionParams): Result<PreparedTransaction> = withContext(Dispatchers.IO) {
try {
@@ -77,13 +81,16 @@
// Native KAVA transfer
Triple(params.to, kavaToWei(params.amount), ByteArray(0))
}
TokenType.GREEN_POINTS, TokenType.ENERGY_POINTS, TokenType.FUTURE_POINTS -> {
    // ERC-20 token transfer
    // To address is the contract, value is 0
    // Data is transfer(recipient, amount) encoded
    val contractAddress = TokenConfig.getContractAddress(params.tokenType)
        ?: return@withContext Result.failure(Exception("Invalid token type"))
    val decimals = TokenConfig.getDecimals(params.tokenType)
    val tokenAmount = tokenToRaw(params.amount, decimals)
    val transferData = encodeErc20Transfer(params.to, tokenAmount)
    Triple(contractAddress, BigInteger.ZERO, transferData)
}
}
@@ -98,7 +105,7 @@
// Default gas limits
when (params.tokenType) {
TokenType.KAVA -> BigInteger.valueOf(21000)
else -> BigInteger.valueOf(65000) // ERC-20 transfers need more gas
}
}
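With those defaults, the worst-case fee a wallet should reserve before letting the user hit "max" is `gasLimit * gasPrice` in wei: 21000 gas for a plain value transfer, ~65000 for an ERC-20 `transfer`. A small sketch with an assumed gas price of 25 gwei (the real price comes from the RPC node at prepare time):

```typescript
// Worst-case fee reservation sketch; 25 gwei is an assumed price,
// not what the node actually returns.
const GWEI = 10n ** 9n;

function maxFeeWei(gasLimit: bigint, gasPriceWei: bigint): bigint {
  return gasLimit * gasPriceWei;
}

const nativeFee = maxFeeWei(21000n, 25n * GWEI);
const erc20Fee = maxFeeWei(65000n, 25n * GWEI);
console.log(nativeFee.toString()); // 525000000000000 (0.000525 KAVA at 18 decimals)
```

This is why the "max" path for native KAVA subtracts the fee from the balance, while ERC-20 tokens can send the full token balance (the fee is paid in KAVA, not in the token).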
@@ -139,7 +146,7 @@
 */
private fun encodeErc20Transfer(to: String, amount: BigInteger): ByteArray {
// Function selector: transfer(address,uint256) = 0xa9059cbb
val selector = ERC20Selectors.TRANSFER.removePrefix("0x").hexToByteArray()
// Encode recipient address (padded to 32 bytes)
val paddedAddress = to.removePrefix("0x").lowercase().padStart(64, '0').hexToByteArray()
@@ -152,21 +159,43 @@
}
/**
* Convert token amount to raw units based on decimals
* @param amount Human-readable amount (e.g., "100.5")
* @param decimals Token decimals (e.g., 6 for USDT-like tokens, 18 for native)
*/
fun tokenToRaw(amount: String, decimals: Int): BigInteger {
val decimal = BigDecimal(amount)
val multiplier = BigDecimal.TEN.pow(decimals)
val rawDecimal = decimal.multiply(multiplier)
return rawDecimal.toBigInteger()
}
/**
* Convert raw units to human-readable token amount
* @param raw Raw amount in smallest units
* @param decimals Token decimals (e.g., 6 for USDT-like tokens, 18 for native)
*/
fun rawToToken(raw: BigInteger, decimals: Int): String {
val rawDecimal = BigDecimal(raw)
val divisor = BigDecimal.TEN.pow(decimals)
val displayDecimal = rawDecimal.divide(divisor, decimals, java.math.RoundingMode.DOWN)
return displayDecimal.toPlainString()
}
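The same fixed-point conversion can be sketched with `bigint` instead of `BigDecimal`, which makes the truncation behavior explicit: a 6-decimal token like dUSDT stores "100.5" as 100500000 raw units, and display formatting truncates toward zero (matching `RoundingMode.DOWN` above). This is a TypeScript illustration of the arithmetic, not the app's code:

```typescript
// Fixed-point conversion sketch for N-decimal tokens using bigint.
// Extra fractional digits are truncated, mirroring RoundingMode.DOWN.
function tokenToRaw(amount: string, decimals: number): bigint {
  const [whole, frac = ""] = amount.split(".");
  // Pad or truncate the fractional part to exactly `decimals` digits
  const fracPadded = (frac + "0".repeat(decimals)).slice(0, decimals);
  return BigInt(whole + fracPadded);
}

function rawToToken(raw: bigint, decimals: number): string {
  const s = raw.toString().padStart(decimals + 1, "0");
  const whole = s.slice(0, s.length - decimals);
  const frac = s.slice(s.length - decimals);
  return `${whole}.${frac}`;
}

console.log(tokenToRaw("100.5", 6).toString()); // 100500000
console.log(rawToToken(100500000n, 6)); // "100.500000"
```

Keeping the math in integers end to end avoids the float rounding errors that `parseFloat`-style conversions introduce for amounts near the token's precision limit.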
/**
* Convert Green Points amount to raw units (6 decimals)
* @deprecated Use tokenToRaw(amount, 6) instead
*/
fun greenPointsToRaw(amount: String): BigInteger {
return tokenToRaw(amount, GreenPointsToken.DECIMALS)
}
/**
* Convert raw units to Green Points display amount
* @deprecated Use rawToToken(raw, 6) instead
*/
fun rawToGreenPoints(raw: BigInteger): String {
    return rawToToken(raw, GreenPointsToken.DECIMALS)
}
/**

View File

@@ -821,6 +821,21 @@ async function handleCoSignStart(event: {
// Mark the start of signing
signInProgressSessionId = event.sessionId;
// CRITICAL: Get the original partyId from keygen (stored in share) for signing
// This is essential for backup/restore - the partyId must match what was used during keygen
const share = database?.getShare(activeCoSignSession.shareId, activeCoSignSession.sharePassword);
if (!share) {
debugLog.error('main', 'Failed to get share data');
mainWindow?.webContents.send(`cosign:events:${event.sessionId}`, {
type: 'failed',
error: 'Failed to get share data',
});
signInProgressSessionId = null;
return;
}
const signingPartyId = share.party_id || grpcClient?.getPartyId() || '';
debugLog.info('main', `Using signingPartyId=${signingPartyId} (currentDevicePartyId=${grpcClient?.getPartyId()})`);
// Log the current activeCoSignSession.participants state
console.log('[CO-SIGN] Current activeCoSignSession.participants before update:',
activeCoSignSession.participants.map(p => ({
@@ -832,8 +847,9 @@
// Update the participant list from event.selectedParties
// Prefer the partyIndex from activeCoSignSession.participants (it comes from signingParties or other_parties)
// CRITICAL: Use signingPartyId (original from keygen) for identification
if (event.selectedParties && event.selectedParties.length > 0) {
const myPartyId = signingPartyId;
const updatedParticipants: Array<{ partyId: string; partyIndex: number; name: string }> = [];
event.selectedParties.forEach((partyId) => {
@@ -869,21 +885,11 @@
})));
}
// Note: share already fetched above for getting signingPartyId
console.log('[CO-SIGN] Calling tssHandler.participateSign with:', {
sessionId: activeCoSignSession.sessionId,
partyId: signingPartyId, // CRITICAL: Use signingPartyId (original from keygen)
partyIndex: activeCoSignSession.partyIndex,
participants: activeCoSignSession.participants.map(p => ({ partyId: p.partyId.substring(0, 8), partyIndex: p.partyIndex })),
threshold: activeCoSignSession.threshold,
@@ -892,9 +898,10 @@
debugLog.info('tss', `Starting sign for session ${event.sessionId}...`);
try {
// CRITICAL: Use signingPartyId (original partyId from keygen) for signing
const result = await (tssHandler as TSSHandler).participateSign( const result = await (tssHandler as TSSHandler).participateSign(
activeCoSignSession.sessionId, activeCoSignSession.sessionId,
grpcClient?.getPartyId() || '', signingPartyId, // CRITICAL: Use original partyId from keygen for backup/restore to work
activeCoSignSession.partyIndex, activeCoSignSession.partyIndex,
activeCoSignSession.participants, activeCoSignSession.participants,
activeCoSignSession.threshold, activeCoSignSession.threshold,
@@ -1613,9 +1620,9 @@ function setupIpcHandlers() {
     initiatorName?: string;
   }) => {
     try {
-      // Get the current party ID
-      const partyId = grpcClient?.getPartyId();
-      if (!partyId) {
+      // Get the current party ID (used to check the connection status)
+      const currentDevicePartyId = grpcClient?.getPartyId();
+      if (!currentDevicePartyId) {
         return { success: false, error: '请先连接到消息路由器' };
       }
@@ -1625,6 +1632,11 @@ function setupIpcHandlers() {
         return { success: false, error: 'Share 不存在或密码错误' };
       }
+      // CRITICAL: Use the original partyId from keygen (stored in share) for signing
+      // This is essential for backup/restore - the partyId must match what was used during keygen
+      const partyId = share.party_id || currentDevicePartyId;
+      debugLog.info('main', `Initiator using partyId=${partyId} (currentDevicePartyId=${currentDevicePartyId})`);
       // Fetch the keygen session's participants from the backend (contains the correct party_index)
       const keygenStatus = await accountClient?.getSessionStatus(share.session_id);
       if (!keygenStatus?.participants || keygenStatus.participants.length === 0) {
@@ -1810,8 +1822,8 @@ function setupIpcHandlers() {
     parties?: Array<{ party_id: string; party_index: number }>;
   }) => {
     try {
-      const partyId = grpcClient?.getPartyId();
-      if (!partyId) {
+      const currentDevicePartyId = grpcClient?.getPartyId();
+      if (!currentDevicePartyId) {
         return { success: false, error: '请先连接到消息路由器' };
       }
@@ -1821,9 +1833,12 @@ function setupIpcHandlers() {
         return { success: false, error: 'Share 不存在或密码错误' };
       }
-      debugLog.info('grpc', `Joining co-sign session: sessionId=${params.sessionId}, partyId=${partyId}`);
+      // CRITICAL: Use the original partyId from keygen (stored in share) for signing
+      // This is essential for backup/restore - the partyId must match what was used during keygen
+      const signingPartyId = share.party_id || currentDevicePartyId;
+      debugLog.info('grpc', `Joining co-sign session: sessionId=${params.sessionId}, signingPartyId=${signingPartyId} (currentDevicePartyId=${currentDevicePartyId})`);
-      const result = await grpcClient?.joinSession(params.sessionId, partyId, params.joinToken);
+      const result = await grpcClient?.joinSession(params.sessionId, signingPartyId, params.joinToken);
       if (result?.success) {
         // Set the active co-sign session
         // Prefer params.parties (from validateInviteCode; contains all expected participants)
@@ -1832,10 +1847,11 @@ function setupIpcHandlers() {
         if (params.parties && params.parties.length > 0) {
           // Use the complete parties list
+          // CRITICAL: Use signingPartyId (original from keygen) for identification
           participants = params.parties.map(p => ({
             partyId: p.party_id,
             partyIndex: p.party_index,
-            name: p.party_id === partyId ? '我' : `参与方 ${p.party_index + 1}`,
+            name: p.party_id === signingPartyId ? '我' : `参与方 ${p.party_index + 1}`,
           }));
           console.log('[CO-SIGN] Participant using params.parties (complete list):', participants.map(p => ({
             partyId: p.partyId.substring(0, 8),
@@ -1850,9 +1866,9 @@ function setupIpcHandlers() {
             name: `参与方 ${idx + 1}`,
           })) || [];
-          // Add self
+          // Add self - CRITICAL: Use signingPartyId (original from keygen)
           participants.push({
-            partyId: partyId,
+            partyId: signingPartyId,
             partyIndex: result.party_index,
             name: '我',
           });
@@ -1886,11 +1902,11 @@ function setupIpcHandlers() {
         messageHash: params.messageHash,
       });
-      // Pre-subscribe to the message stream
+      // Pre-subscribe to the message stream - CRITICAL: Use signingPartyId (original from keygen)
       if (tssHandler && 'prepareForSign' in tssHandler) {
         try {
-          debugLog.info('tss', `Preparing for sign: subscribing to messages for session ${params.sessionId}`);
-          (tssHandler as TSSHandler).prepareForSign(params.sessionId, partyId);
+          debugLog.info('tss', `Preparing for sign: subscribing to messages for session ${params.sessionId}, signingPartyId=${signingPartyId}`);
+          (tssHandler as TSSHandler).prepareForSign(params.sessionId, signingPartyId);
         } catch (prepareErr) {
           debugLog.error('tss', `Failed to prepare for sign: ${(prepareErr as Error).message}`);
           return { success: false, error: `消息订阅失败: ${(prepareErr as Error).message}` };


@@ -11,7 +11,12 @@ import {
   getCurrentRpcUrl,
   getGasPrice,
   fetchGreenPointsBalance,
+  fetchEnergyPointsBalance,
+  fetchFuturePointsBalance,
   GREEN_POINTS_TOKEN,
+  ENERGY_POINTS_TOKEN,
+  FUTURE_POINTS_TOKEN,
+  TOKEN_CONFIG,
   type PreparedTransaction,
   type TokenType,
 } from '../utils/transaction';
@@ -32,6 +37,8 @@ interface ShareWithAddress extends ShareItem {
   evmAddress?: string;
   kavaBalance?: string;
   greenPointsBalance?: string;
+  energyPointsBalance?: string;
+  futurePointsBalance?: string;
   balanceLoading?: boolean;
 }
@@ -89,15 +96,30 @@ export default function Home() {
   const [isCalculatingMax, setIsCalculatingMax] = useState(false);
   const [copySuccess, setCopySuccess] = useState(false);
+  // Get the balance of the currently selected token
+  const getTokenBalance = (share: ShareWithAddress | null, tokenType: TokenType): string => {
+    if (!share) return '0';
+    switch (tokenType) {
+      case 'KAVA':
+        return share.kavaBalance || '0';
+      case 'GREEN_POINTS':
+        return share.greenPointsBalance || '0';
+      case 'ENERGY_POINTS':
+        return share.energyPointsBalance || '0';
+      case 'FUTURE_POINTS':
+        return share.futurePointsBalance || '0';
+    }
+  };
   // Compute the maximum transferable amount after deducting the gas fee
   const calculateMaxAmount = async () => {
     if (!transferShare?.evmAddress) return;
     setIsCalculatingMax(true);
     try {
-      if (transferTokenType === 'GREEN_POINTS') {
-        // For token transfers, use the full token balance (gas is paid in KAVA)
-        const balance = transferShare.greenPointsBalance || '0';
+      if (TOKEN_CONFIG.isERC20(transferTokenType)) {
+        // For ERC-20 token transfers, use the full token balance (gas is paid in KAVA)
+        const balance = getTokenBalance(transferShare, transferTokenType);
         setTransferAmount(balance);
         setTransferError(null);
       } else {
@@ -131,8 +153,8 @@
       }
     } catch (error) {
       console.error('Failed to calculate max amount:', error);
-      if (transferTokenType === 'GREEN_POINTS') {
-        setTransferAmount(transferShare.greenPointsBalance || '0');
+      if (TOKEN_CONFIG.isERC20(transferTokenType)) {
+        setTransferAmount(getTokenBalance(transferShare, transferTokenType));
       } else {
         // If fetching the gas price fails, fall back to a default estimate (1 gwei * 21000)
         const defaultGasFee = 0.000021; // ~21000 * 1 gwei
@@ -165,12 +187,14 @@
     const updatedShares = await Promise.all(
       sharesWithAddrs.map(async (share) => {
         if (share.evmAddress) {
-          // Fetch both balances in parallel
-          const [kavaBalance, greenPointsBalance] = await Promise.all([
+          // Fetch all balances in parallel
+          const [kavaBalance, greenPointsBalance, energyPointsBalance, futurePointsBalance] = await Promise.all([
             fetchKavaBalance(share.evmAddress),
             fetchGreenPointsBalance(share.evmAddress),
+            fetchEnergyPointsBalance(share.evmAddress),
+            fetchFuturePointsBalance(share.evmAddress),
           ]);
-          return { ...share, kavaBalance, greenPointsBalance, balanceLoading: false };
+          return { ...share, kavaBalance, greenPointsBalance, energyPointsBalance, futurePointsBalance, balanceLoading: false };
         }
         return { ...share, balanceLoading: false };
       })
@@ -315,11 +339,7 @@
       return '转账金额无效';
     }
     const amount = parseFloat(transferAmount);
-    const balance = parseFloat(
-      transferTokenType === 'GREEN_POINTS'
-        ? (transferShare?.greenPointsBalance || '0')
-        : (transferShare?.kavaBalance || '0')
-    );
+    const balance = parseFloat(getTokenBalance(transferShare, transferTokenType));
     if (amount > balance) {
       return '余额不足';
     }
@@ -486,7 +506,7 @@
             </div>
           )}
-          {/* Balance display - KAVA and Green Points */}
+          {/* Balance display - all tokens */}
           {share.evmAddress && (
             <div className={styles.balanceSection}>
               <div className={styles.balanceRow}>
@@ -509,6 +529,26 @@
                 )}
               </span>
             </div>
+            <div className={styles.balanceRow}>
+              <span className={styles.balanceLabel} style={{ color: '#2196F3' }}>{ENERGY_POINTS_TOKEN.name}</span>
+              <span className={styles.balanceValue} style={{ color: '#2196F3' }}>
+                {share.balanceLoading ? (
+                  <span className={styles.balanceLoading}>...</span>
+                ) : (
+                  <>{share.energyPointsBalance || '0'}</>
+                )}
+              </span>
+            </div>
+            <div className={styles.balanceRow}>
+              <span className={styles.balanceLabel} style={{ color: '#9C27B0' }}>{FUTURE_POINTS_TOKEN.name}</span>
+              <span className={styles.balanceValue} style={{ color: '#9C27B0' }}>
+                {share.balanceLoading ? (
+                  <span className={styles.balanceLoading}>...</span>
+                ) : (
+                  <>{share.futurePointsBalance || '0'}</>
+                )}
+              </span>
+            </div>
           </div>
         )}
@@ -578,7 +618,10 @@
         <div className={styles.transferWalletInfo}>
           <div className={styles.transferWalletName}>{transferShare.walletName}</div>
           <div className={styles.transferWalletBalance}>
-            KAVA: {transferShare.kavaBalance || '0'} | {GREEN_POINTS_TOKEN.name}: {transferShare.greenPointsBalance || '0'}
+            KAVA: {transferShare.kavaBalance || '0'} | <span style={{color: '#4CAF50'}}>{GREEN_POINTS_TOKEN.name}: {transferShare.greenPointsBalance || '0'}</span>
+          </div>
+          <div className={styles.transferWalletBalance}>
+            <span style={{color: '#2196F3'}}>{ENERGY_POINTS_TOKEN.name}: {transferShare.energyPointsBalance || '0'}</span> | <span style={{color: '#9C27B0'}}>{FUTURE_POINTS_TOKEN.name}: {transferShare.futurePointsBalance || '0'}</span>
           </div>
           <div className={styles.transferNetwork}>
             网络: Kava {getCurrentNetwork() === 'mainnet' ? '主网' : '测试网'}
@@ -605,6 +648,22 @@
             {GREEN_POINTS_TOKEN.name}
           </button>
         </div>
+        <div className={styles.tokenTypeSelector} style={{ marginTop: '8px' }}>
+          <button
+            className={`${styles.tokenTypeButton} ${transferTokenType === 'ENERGY_POINTS' ? styles.tokenTypeActive : ''}`}
+            onClick={() => { setTransferTokenType('ENERGY_POINTS'); setTransferAmount(''); }}
+            style={transferTokenType === 'ENERGY_POINTS' ? { backgroundColor: '#2196F3', borderColor: '#2196F3' } : {}}
+          >
+            {ENERGY_POINTS_TOKEN.name}
+          </button>
+          <button
+            className={`${styles.tokenTypeButton} ${transferTokenType === 'FUTURE_POINTS' ? styles.tokenTypeActive : ''}`}
+            onClick={() => { setTransferTokenType('FUTURE_POINTS'); setTransferAmount(''); }}
+            style={transferTokenType === 'FUTURE_POINTS' ? { backgroundColor: '#9C27B0', borderColor: '#9C27B0' } : {}}
+          >
+            {FUTURE_POINTS_TOKEN.name}
+          </button>
+        </div>
       </div>
       {/* Recipient address */}
@@ -622,7 +681,7 @@
       {/* Transfer amount */}
       <div className={styles.transferInputGroup}>
         <label className={styles.transferLabel}>
-          ({transferTokenType === 'GREEN_POINTS' ? GREEN_POINTS_TOKEN.name : 'KAVA'})
+          ({TOKEN_CONFIG.getName(transferTokenType)})
         </label>
         <div className={styles.transferAmountWrapper}>
           <input
@@ -689,8 +748,8 @@
       <div className={styles.confirmDetails}>
         <div className={styles.confirmRow}>
           <span className={styles.confirmLabel}></span>
-          <span className={styles.confirmValue} style={transferTokenType === 'GREEN_POINTS' ? { color: '#4CAF50' } : {}}>
-            {transferTokenType === 'GREEN_POINTS' ? GREEN_POINTS_TOKEN.name : 'KAVA'}
+          <span className={styles.confirmValue} style={TOKEN_CONFIG.isERC20(transferTokenType) ? { color: transferTokenType === 'GREEN_POINTS' ? '#4CAF50' : transferTokenType === 'ENERGY_POINTS' ? '#2196F3' : '#9C27B0' } : {}}>
+            {TOKEN_CONFIG.getName(transferTokenType)}
           </span>
         </div>
         <div className={styles.confirmRow}>
@@ -699,8 +758,8 @@
         </div>
         <div className={styles.confirmRow}>
           <span className={styles.confirmLabel}></span>
-          <span className={styles.confirmValue} style={transferTokenType === 'GREEN_POINTS' ? { color: '#4CAF50' } : {}}>
-            {transferAmount} {transferTokenType === 'GREEN_POINTS' ? GREEN_POINTS_TOKEN.name : 'KAVA'}
+          <span className={styles.confirmValue} style={TOKEN_CONFIG.isERC20(transferTokenType) ? { color: transferTokenType === 'GREEN_POINTS' ? '#4CAF50' : transferTokenType === 'ENERGY_POINTS' ? '#2196F3' : '#9C27B0' } : {}}>
+            {transferAmount} {TOKEN_CONFIG.getName(transferTokenType)}
           </span>
         </div>
         <div className={styles.confirmRow}>


@@ -17,17 +17,97 @@ export const KAVA_RPC_URL = {
 };
 // Token types
-export type TokenType = 'KAVA' | 'GREEN_POINTS';
+export type TokenType = 'KAVA' | 'GREEN_POINTS' | 'ENERGY_POINTS' | 'FUTURE_POINTS';
-// Green Points (绿积分) Token Configuration
+// Common ERC-20 function selectors
+export const ERC20_SELECTORS = {
+  balanceOf: '0x70a08231', // balanceOf(address)
+  transfer: '0xa9059cbb', // transfer(address,uint256)
+  approve: '0x095ea7b3', // approve(address,uint256)
+  allowance: '0xdd62ed3e', // allowance(address,address)
+  totalSupply: '0x18160ddd', // totalSupply()
+};
+// Green Points (绿积分) Token Configuration - dUSDT
 export const GREEN_POINTS_TOKEN = {
   contractAddress: '0xA9F3A35dBa8699c8C681D8db03F0c1A8CEB9D7c3',
   name: '绿积分',
   symbol: 'dUSDT',
   decimals: 6,
-  // ERC-20 function selectors
-  balanceOfSelector: '0x70a08231',
-  transferSelector: '0xa9059cbb',
+  // ERC-20 function selectors (kept for backward compatibility)
+  balanceOfSelector: ERC20_SELECTORS.balanceOf,
+  transferSelector: ERC20_SELECTORS.transfer,
+};
+// Energy Points (积分股) Token Configuration - eUSDT
+export const ENERGY_POINTS_TOKEN = {
+  contractAddress: '0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931',
+  name: '积分股',
+  symbol: 'eUSDT',
+  decimals: 6,
+};
+// Future Points (积分值) Token Configuration - fUSDT
+export const FUTURE_POINTS_TOKEN = {
+  contractAddress: '0x14dc4f7d3E4197438d058C3D156dd9826A161134',
+  name: '积分值',
+  symbol: 'fUSDT',
+  decimals: 6,
+};
+// Token configuration utility
+export const TOKEN_CONFIG = {
+  getContractAddress: (tokenType: TokenType): string | null => {
+    switch (tokenType) {
+      case 'KAVA':
+        return null; // Native token has no contract
+      case 'GREEN_POINTS':
+        return GREEN_POINTS_TOKEN.contractAddress;
+      case 'ENERGY_POINTS':
+        return ENERGY_POINTS_TOKEN.contractAddress;
+      case 'FUTURE_POINTS':
+        return FUTURE_POINTS_TOKEN.contractAddress;
+    }
+  },
+  getDecimals: (tokenType: TokenType): number => {
+    switch (tokenType) {
+      case 'KAVA':
+        return 18;
+      case 'GREEN_POINTS':
+        return GREEN_POINTS_TOKEN.decimals;
+      case 'ENERGY_POINTS':
+        return ENERGY_POINTS_TOKEN.decimals;
+      case 'FUTURE_POINTS':
+        return FUTURE_POINTS_TOKEN.decimals;
+    }
+  },
+  getName: (tokenType: TokenType): string => {
+    switch (tokenType) {
+      case 'KAVA':
+        return 'KAVA';
+      case 'GREEN_POINTS':
+        return GREEN_POINTS_TOKEN.name;
+      case 'ENERGY_POINTS':
+        return ENERGY_POINTS_TOKEN.name;
+      case 'FUTURE_POINTS':
+        return FUTURE_POINTS_TOKEN.name;
+    }
+  },
+  getSymbol: (tokenType: TokenType): string => {
+    switch (tokenType) {
+      case 'KAVA':
+        return 'KAVA';
+      case 'GREEN_POINTS':
+        return GREEN_POINTS_TOKEN.symbol;
+      case 'ENERGY_POINTS':
+        return ENERGY_POINTS_TOKEN.symbol;
+      case 'FUTURE_POINTS':
+        return FUTURE_POINTS_TOKEN.symbol;
+    }
+  },
+  isERC20: (tokenType: TokenType): boolean => {
+    return tokenType !== 'KAVA';
+  },
 };
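The exhaustive `switch` statements in `TOKEN_CONFIG` can also be expressed as a single lookup table; a sketch (not the repo's code — `TOKENS` and `TokenMeta` are hypothetical names) that keeps the same compile-time exhaustiveness guarantee via `Record<TokenType, …>`:

```typescript
type TokenType = 'KAVA' | 'GREEN_POINTS' | 'ENERGY_POINTS' | 'FUTURE_POINTS';

interface TokenMeta {
  name: string;
  symbol: string;
  decimals: number;
  contractAddress: string | null; // null = native token
}

// Record<TokenType, TokenMeta> makes the compiler reject a missing variant,
// the same exhaustiveness the switch statements provide.
const TOKENS: Record<TokenType, TokenMeta> = {
  KAVA:          { name: 'KAVA',  symbol: 'KAVA',  decimals: 18, contractAddress: null },
  GREEN_POINTS:  { name: '绿积分', symbol: 'dUSDT', decimals: 6, contractAddress: '0xA9F3A35dBa8699c8C681D8db03F0c1A8CEB9D7c3' },
  ENERGY_POINTS: { name: '积分股', symbol: 'eUSDT', decimals: 6, contractAddress: '0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931' },
  FUTURE_POINTS: { name: '积分值', symbol: 'fUSDT', decimals: 6, contractAddress: '0x14dc4f7d3E4197438d058C3D156dd9826A161134' },
};

const isERC20 = (t: TokenType): boolean => TOKENS[t].contractAddress !== null;
```

Adding a fifth token type then fails to compile until both the union and the table are updated, instead of silently falling through five separate switches.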
 // Current network configuration (read from localStorage or use the default)
@@ -327,44 +407,69 @@ export function weiToKava(wei: bigint): string {
 }
 /**
- * Convert a Green Points amount to raw units (6 decimals)
+ * Convert a token amount to raw units
+ * @param amount Human-readable amount
+ * @param decimals Token decimals (default 6 for USDT-like tokens)
  */
-export function greenPointsToRaw(amount: string): bigint {
+export function tokenToRaw(amount: string, decimals: number = 6): bigint {
   const parts = amount.split('.');
   const whole = BigInt(parts[0] || '0');
   let fraction = parts[1] || '';
-  // Pad or truncate to 6 digits
-  if (fraction.length > 6) {
-    fraction = fraction.substring(0, 6);
+  // Pad or truncate to the given number of digits
+  if (fraction.length > decimals) {
+    fraction = fraction.substring(0, decimals);
   } else {
-    fraction = fraction.padEnd(6, '0');
+    fraction = fraction.padEnd(decimals, '0');
   }
-  return whole * BigInt(10 ** 6) + BigInt(fraction);
+  return whole * BigInt(10 ** decimals) + BigInt(fraction);
 }
 /**
- * Convert raw units back to a Green Points amount
+ * Convert raw units back to a human-readable token amount
+ * @param raw Raw amount in smallest units
+ * @param decimals Token decimals (default 6 for USDT-like tokens)
  */
-export function rawToGreenPoints(raw: bigint): string {
-  const rawStr = raw.toString().padStart(7, '0');
-  const whole = rawStr.slice(0, -6) || '0';
-  const fraction = rawStr.slice(-6).replace(/0+$/, '');
+export function rawToToken(raw: bigint, decimals: number = 6): string {
+  const rawStr = raw.toString().padStart(decimals + 1, '0');
+  const whole = rawStr.slice(0, -decimals) || '0';
+  const fraction = rawStr.slice(-decimals).replace(/0+$/, '');
   return fraction ? `${whole}.${fraction}` : whole;
 }
 /**
- * Fetch the Green Points balance (ERC-20)
+ * Convert a Green Points amount to raw units (6 decimals)
+ * @deprecated Use tokenToRaw(amount, 6) instead
  */
-export async function fetchGreenPointsBalance(address: string): Promise<string> {
+export function greenPointsToRaw(amount: string): bigint {
+  return tokenToRaw(amount, GREEN_POINTS_TOKEN.decimals);
+}
+/**
+ * Convert raw units back to a Green Points amount
+ * @deprecated Use rawToToken(raw, 6) instead
+ */
+export function rawToGreenPoints(raw: bigint): string {
+  return rawToToken(raw, GREEN_POINTS_TOKEN.decimals);
+}
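The generalized decimal-scaling helpers introduced in this commit can be exercised standalone; a minimal sketch that copies `tokenToRaw`/`rawToToken` from the diff above:

```typescript
// Standalone copy of the helpers from the diff, for a quick round-trip check.
function tokenToRaw(amount: string, decimals: number = 6): bigint {
  const parts = amount.split('.');
  const whole = BigInt(parts[0] || '0');
  let fraction = parts[1] || '';
  // Pad or truncate the fractional part to `decimals` digits
  fraction = fraction.length > decimals
    ? fraction.substring(0, decimals)
    : fraction.padEnd(decimals, '0');
  return whole * BigInt(10 ** decimals) + BigInt(fraction);
}

function rawToToken(raw: bigint, decimals: number = 6): string {
  const rawStr = raw.toString().padStart(decimals + 1, '0');
  const whole = rawStr.slice(0, -decimals) || '0';
  const fraction = rawStr.slice(-decimals).replace(/0+$/, ''); // strip trailing zeros
  return fraction ? `${whole}.${fraction}` : whole;
}

console.log(tokenToRaw('1.5'));          // 1500000n at 6 decimals
console.log(rawToToken(1500000n));       // "1.5"
console.log(tokenToRaw('0.1234567'));    // extra digits are truncated, not rounded
```

Note that excess fractional digits are truncated rather than rounded, which is the conservative choice for transfer amounts (never send more than the user typed).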
+/**
+ * Fetch an ERC-20 token balance
+ * @param address Wallet address
+ * @param contractAddress Token contract address
+ * @param decimals Token decimals
+ */
+export async function fetchERC20Balance(
+  address: string,
+  contractAddress: string,
+  decimals: number = 6
+): Promise<string> {
   try {
     const rpcUrl = getCurrentRpcUrl();
     // Encode balanceOf(address) call data
-    // Function selector: 0x70a08231
-    // Address parameter: padded to 32 bytes
     const paddedAddress = address.toLowerCase().replace('0x', '').padStart(64, '0');
-    const callData = GREEN_POINTS_TOKEN.balanceOfSelector + paddedAddress;
+    const callData = ERC20_SELECTORS.balanceOf + paddedAddress;
     const response = await fetch(rpcUrl, {
       method: 'POST',
@@ -374,7 +479,7 @@ export async function fetchGreenPointsBalance(address: string): Promise<string>
       method: 'eth_call',
       params: [
         {
-          to: GREEN_POINTS_TOKEN.contractAddress,
+          to: contractAddress,
           data: callData,
         },
         'latest',
@@ -386,21 +491,65 @@ export async function fetchGreenPointsBalance(address: string): Promise<string>
     const data = await response.json();
     if (data.result && data.result !== '0x') {
       const balanceRaw = BigInt(data.result);
-      return rawToGreenPoints(balanceRaw);
+      return rawToToken(balanceRaw, decimals);
     }
     return '0';
   } catch (error) {
-    console.error('Failed to fetch Green Points balance:', error);
+    console.error('Failed to fetch ERC20 balance:', error);
     return '0';
   }
 }
+/**
+ * Fetch the Green Points balance (ERC-20)
+ */
+export async function fetchGreenPointsBalance(address: string): Promise<string> {
+  return fetchERC20Balance(address, GREEN_POINTS_TOKEN.contractAddress, GREEN_POINTS_TOKEN.decimals);
+}
+/**
+ * Fetch the Energy Points balance (eUSDT)
+ */
+export async function fetchEnergyPointsBalance(address: string): Promise<string> {
+  return fetchERC20Balance(address, ENERGY_POINTS_TOKEN.contractAddress, ENERGY_POINTS_TOKEN.decimals);
+}
+/**
+ * Fetch the Future Points balance (fUSDT)
+ */
+export async function fetchFuturePointsBalance(address: string): Promise<string> {
+  return fetchERC20Balance(address, FUTURE_POINTS_TOKEN.contractAddress, FUTURE_POINTS_TOKEN.decimals);
+}
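`fetchERC20Balance` builds its `eth_call` request by concatenating the `balanceOf` selector with the caller's address left-padded to 32 bytes; that calldata construction is a pure string operation and can be checked offline (a sketch mirroring the diff; the helper name is hypothetical):

```typescript
// Builds the calldata for balanceOf(address):
// 4-byte selector + address left-padded to a 32-byte ABI word.
function buildBalanceOfCallData(wallet: string): string {
  const padded = wallet.toLowerCase().replace('0x', '').padStart(64, '0');
  return '0x70a08231' + padded;
}

// The result goes into the JSON-RPC params as { to: contractAddress, data: ... }.
const data = buildBalanceOfCallData('0x1234567890AbcdEF1234567890aBcdef12345678');
console.log(data.length); // "0x" + 8 selector chars + 64 argument chars = 74
```

Because a 20-byte address is only 40 hex characters, the `padStart(64, '0')` supplies the 24 leading zero characters the ABI word requires.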
+/**
+ * Fetch all token balances for an address
+ */
+export async function fetchAllTokenBalances(address: string): Promise<{
+  kava: string;
+  greenPoints: string;
+  energyPoints: string;
+  futurePoints: string;
+}> {
+  const [greenPoints, energyPoints, futurePoints] = await Promise.all([
+    fetchGreenPointsBalance(address),
+    fetchEnergyPointsBalance(address),
+    fetchFuturePointsBalance(address),
+  ]);
+  // Note: KAVA balance is fetched separately via eth_getBalance
+  return {
+    kava: '0', // Caller should fetch KAVA balance separately
+    greenPoints,
+    energyPoints,
+    futurePoints,
+  };
+}
 /**
  * Encode ERC-20 transfer function call
  */
 function encodeErc20Transfer(to: string, amount: bigint): string {
   // Function selector: transfer(address,uint256) = 0xa9059cbb
-  const selector = GREEN_POINTS_TOKEN.transferSelector;
+  const selector = ERC20_SELECTORS.transfer;
   // Encode recipient address (padded to 32 bytes)
   const paddedAddress = to.toLowerCase().replace('0x', '').padStart(64, '0');
   // Encode amount (padded to 32 bytes)
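The hunk above is truncated before the amount encoding. Under standard ABI encoding the amount is the hex value of the `uint256` left-padded to 32 bytes; a complete standalone sketch on that assumption:

```typescript
// Full sketch of encodeErc20Transfer; the diff hunk cuts off before the amount
// encoding, assumed here to be the standard 32-byte left-padded hex uint256.
function encodeErc20Transfer(to: string, amount: bigint): string {
  const selector = '0xa9059cbb'; // transfer(address,uint256)
  const paddedAddress = to.toLowerCase().replace('0x', '').padStart(64, '0');
  const paddedAmount = amount.toString(16).padStart(64, '0');
  return selector + paddedAddress + paddedAmount;
}

const callData = encodeErc20Transfer(
  '0x1234567890abcdef1234567890abcdef12345678',
  1_500_000n // 1.5 dUSDT at 6 decimals
);
// "0x" + 8 selector chars + 64 address chars + 64 amount chars = 138 characters
```

This is exactly the `data` field that `estimateGas` and `prepareTransaction` send with `to` set to the token contract and `value` set to `0x0`.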
@@ -476,13 +625,15 @@ export async function estimateGas(params: { from: string; to: string; value: str
   // For token transfers, we need different params
   let txParams: { from: string; to: string; value: string; data?: string };
-  if (tokenType === 'GREEN_POINTS') {
+  if (TOKEN_CONFIG.isERC20(tokenType)) {
     // ERC-20 transfer: to is contract, value is 0, data is transfer call
-    const tokenAmount = greenPointsToRaw(params.value);
+    const contractAddress = TOKEN_CONFIG.getContractAddress(tokenType);
+    const decimals = TOKEN_CONFIG.getDecimals(tokenType);
+    const tokenAmount = tokenToRaw(params.value, decimals);
     const transferData = encodeErc20Transfer(params.to, tokenAmount);
     txParams = {
       from: params.from,
-      to: GREEN_POINTS_TOKEN.contractAddress,
+      to: contractAddress!,
       value: '0x0',
       data: transferData,
     };
@@ -511,7 +662,7 @@
   if (data.error) {
     // If estimation fails, fall back to a default value
     console.warn('Gas 估算失败,使用默认值:', data.error);
-    return tokenType === 'GREEN_POINTS' ? BigInt(65000) : BigInt(21000);
+    return TOKEN_CONFIG.isERC20(tokenType) ? BigInt(65000) : BigInt(21000);
   }
   return BigInt(data.result);
 }
@@ -543,12 +694,14 @@ export async function prepareTransaction(params: TransactionParams): Promise<Pre
   let value: bigint;
   let data: string;
-  if (tokenType === 'GREEN_POINTS') {
+  if (TOKEN_CONFIG.isERC20(tokenType)) {
     // ERC-20 token transfer
     // To address is the contract, value is 0
     // Data is transfer(recipient, amount) encoded
-    const tokenAmount = greenPointsToRaw(params.value);
-    toAddress = GREEN_POINTS_TOKEN.contractAddress.toLowerCase();
+    const contractAddress = TOKEN_CONFIG.getContractAddress(tokenType);
+    const decimals = TOKEN_CONFIG.getDecimals(tokenType);
+    const tokenAmount = tokenToRaw(params.value, decimals);
+    toAddress = contractAddress!.toLowerCase();
     value = BigInt(0);
     data = encodeErc20Transfer(params.to, tokenAmount);
   } else {


@@ -1,7 +1,6 @@
 -- ============================================================================
 -- auth-service initial migration
--- Merged from: 20260111000000_init, 20260111083500_allow_nullable_phone_password,
--- 20260112110000_add_nickname_to_synced_legacy_users
+-- Merged from: 0001_init, 0002_add_transactional_idempotency
 -- ============================================================================
 -- CreateEnum
@@ -241,3 +240,26 @@ ALTER TABLE "sms_logs" ADD CONSTRAINT "sms_logs_user_id_fkey" FOREIGN KEY ("user
 -- AddForeignKey
 ALTER TABLE "login_logs" ADD CONSTRAINT "login_logs_user_id_fkey" FOREIGN KEY ("user_id") REFERENCES "users"("id") ON DELETE SET NULL ON UPDATE CASCADE;
+-- ============================================================================
+-- Transactional idempotent-consumption support (merged from 0002_add_transactional_idempotency)
+-- Provides 100% exactly-once semantics for the 1.0 -> 2.0 CDC sync
+-- ============================================================================
+-- CreateTable
+CREATE TABLE "processed_cdc_events" (
+    "id" BIGSERIAL NOT NULL,
+    "source_topic" TEXT NOT NULL,
+    "offset" BIGINT NOT NULL,
+    "table_name" TEXT NOT NULL,
+    "operation" TEXT NOT NULL,
+    "processed_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    CONSTRAINT "processed_cdc_events_pkey" PRIMARY KEY ("id")
+);
+-- CreateIndex (composite unique index guarantees idempotency)
+CREATE UNIQUE INDEX "processed_cdc_events_source_topic_offset_key" ON "processed_cdc_events"("source_topic", "offset");
+-- CreateIndex (time index used for cleaning up old rows)
+CREATE INDEX "processed_cdc_events_processed_at_idx" ON "processed_cdc_events"("processed_at");
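The `(source_topic, offset)` unique index is what makes the consumer exactly-once: in production the marker `INSERT` and the business-side update run in one database transaction, so a redelivered Kafka message either finds the marker and is skipped, or commits atomically. A minimal in-memory sketch of that pattern (the `Set` stands in for the unique index; names are illustrative, not the service's code):

```typescript
// In-memory stand-in for the processed_cdc_events unique index.
const processed = new Set<string>();

// Returns true if the event was applied, false if it was a duplicate delivery.
function consumeOnce(topic: string, offset: number, apply: () => void): boolean {
  const key = `${topic}:${offset}`;     // mirrors the (source_topic, offset) composite key
  if (processed.has(key)) return false; // duplicate -> no-op (idempotent)
  apply();                              // business-side write; in SQL this and the
  processed.add(key);                   // marker INSERT share one transaction
  return true;
}

let applied = 0;
consumeOnce('cdc.identity.public.user_accounts', 42, () => { applied += 1; });
consumeOnce('cdc.identity.public.user_accounts', 42, () => { applied += 1; }); // skipped
console.log(applied); // the handler ran exactly once
```

The real implementation gets its atomicity from the database (e.g. `INSERT ... ON CONFLICT DO NOTHING` plus a rollback on failure), which an in-memory `Set` cannot provide across crashes; the sketch only shows the keying scheme.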


@@ -1,25 +0,0 @@
--- ============================================================================
--- Add transactional idempotent-consumption support
--- Provides 100% exactly-once semantics for the 1.0 -> 2.0 CDC sync
--- ============================================================================
--- Create the processed_cdc_events table (for CDC event idempotency)
--- Unique key: (source_topic, offset) - Kafka topic name + message offset
--- Guarantees each CDC event is processed only once (exactly-once semantics)
-CREATE TABLE IF NOT EXISTS "processed_cdc_events" (
-    "id" BIGSERIAL NOT NULL,
-    "source_topic" VARCHAR(200) NOT NULL, -- Kafka topic name (e.g. cdc.identity.public.user_accounts)
-    "offset" BIGINT NOT NULL, -- Kafka message offset (unique within a partition)
-    "table_name" VARCHAR(100) NOT NULL, -- source table name
-    "operation" VARCHAR(10) NOT NULL, -- CDC operation type: c (create), u (update), d (delete), r (snapshot read)
-    "processed_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
-    CONSTRAINT "processed_cdc_events_pkey" PRIMARY KEY ("id")
-);
--- Composite unique index: (source_topic, offset) guarantees idempotency
--- Note: this is not a database auto-increment ID but the unique identifier of the Kafka message
-CREATE UNIQUE INDEX "processed_cdc_events_source_topic_offset_key" ON "processed_cdc_events"("source_topic", "offset");
--- Time index used for cleaning up old rows
-CREATE INDEX "processed_cdc_events_processed_at_idx" ON "processed_cdc_events"("processed_at");


@@ -22,7 +22,7 @@ class ChangePasswordDto {
   newPassword: string;
 }
-@Controller('password')
+@Controller('auth/password')
 @UseGuards(ThrottlerGuard)
 export class PasswordController {
   constructor(private readonly passwordService: PasswordService) {}


@@ -21,7 +21,7 @@ class VerifySmsDto {
   type: 'REGISTER' | 'LOGIN' | 'RESET_PASSWORD' | 'CHANGE_PHONE';
 }
-@Controller('sms')
+@Controller('auth/sms')
 @UseGuards(ThrottlerGuard)
 export class SmsController {
   constructor(private readonly smsService: SmsService) {}


@@ -7,7 +7,7 @@ import { UserService, UserProfileResult } from '@/application/services';
 import { JwtAuthGuard } from '@/shared/guards/jwt-auth.guard';
 import { CurrentUser } from '@/shared/decorators/current-user.decorator';
-@Controller('user')
+@Controller('auth/user')
 @UseGuards(JwtAuthGuard)
 export class UserController {
   constructor(private readonly userService: UserService) {}


@@ -9,7 +9,7 @@ import { InfrastructureModule } from './infrastructure/infrastructure.module';
   // Configuration module
   ConfigModule.forRoot({
     isGlobal: true,
-    envFilePath: ['.env.local', '.env'],
+    envFilePath: ['.env.local', '.env', '../.env'],
   }),
   // Rate-limiting module


@ -0,0 +1,78 @@
// SPDX-License-Identifier: MIT
pragma solidity 0.8.19;
/**
* @title EnergyUSDT
* @dev Fixed supply ERC-20 token - NO MINTING CAPABILITY
* Total Supply: 10,002,000,000 (100.02 Billion) tokens with 6 decimals (matching USDT)
*
* IMPORTANT: This contract has NO mint function and NO way to increase supply.
* All tokens are minted to the deployer at construction time.
*/
contract EnergyUSDT {
string public constant name = "Energy USDT";
string public constant symbol = "eUSDT";
uint8 public constant decimals = 6;
// Fixed total supply: 10.002 billion tokens (10,002,000,000 * 10^6)
uint256 public constant totalSupply = 10_002_000_000 * 10**6;
mapping(address => uint256) private _balances;
mapping(address => mapping(address => uint256)) private _allowances;
event Transfer(address indexed from, address indexed to, uint256 value);
event Approval(address indexed owner, address indexed spender, uint256 value);
/**
* @dev Constructor - mints entire fixed supply to deployer
* No mint function exists - supply is permanently fixed
*/
constructor() {
_balances[msg.sender] = totalSupply;
emit Transfer(address(0), msg.sender, totalSupply);
}
function balanceOf(address account) public view returns (uint256) {
return _balances[account];
}
function transfer(address to, uint256 amount) public returns (bool) {
require(to != address(0), "Transfer to zero address");
require(_balances[msg.sender] >= amount, "Insufficient balance");
unchecked {
_balances[msg.sender] -= amount;
_balances[to] += amount;
}
emit Transfer(msg.sender, to, amount);
return true;
}
function allowance(address owner, address spender) public view returns (uint256) {
return _allowances[owner][spender];
}
function approve(address spender, uint256 amount) public returns (bool) {
require(spender != address(0), "Approve to zero address");
_allowances[msg.sender][spender] = amount;
emit Approval(msg.sender, spender, amount);
return true;
}
function transferFrom(address from, address to, uint256 amount) public returns (bool) {
require(from != address(0), "Transfer from zero address");
require(to != address(0), "Transfer to zero address");
require(_balances[from] >= amount, "Insufficient balance");
require(_allowances[from][msg.sender] >= amount, "Insufficient allowance");
unchecked {
_balances[from] -= amount;
_balances[to] += amount;
_allowances[from][msg.sender] -= amount;
}
emit Transfer(from, to, amount);
return true;
}
}


@@ -0,0 +1,81 @@
# eUSDT (Energy USDT)

## Token Information

| Property | Value |
|------|-----|
| Name | Energy USDT |
| Symbol | eUSDT |
| Decimals | 6 |
| Total Supply | 10,002,000,000 (10.002 billion) |
| Standard | ERC-20 |
| Chain | KAVA Mainnet (Chain ID: 2222) |

## Contract Features

- **Fixed supply**: 10.002 billion tokens, all minted to the deployer at deployment time
- **No minting**: the contract has no mint function; the supply is permanently fixed
- **No burning**: no burn capability at the contract level
- **Not upgradeable**: the contract logic is permanently fixed
- **Standard ERC-20**: fully compatible with mainstream wallets and DEXes
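Because the supply is fixed at construction, it can be sanity-checked off-chain. A quick sketch in plain Node, using the contract's 6-decimal base units:

```javascript
// EnergyUSDT stores balances in base units with 6 decimals (like USDT).
// totalSupply = 10_002_000_000 * 10**6 base units.
const DECIMALS = 6n;
const SUPPLY_TOKENS = 10_002_000_000n; // 10.002 billion eUSDT
const SUPPLY_BASE_UNITS = SUPPLY_TOKENS * 10n ** DECIMALS;

console.log(SUPPLY_BASE_UNITS.toString()); // "10002000000000000"

// Converting a human-readable amount to base units:
function toBaseUnits(tokens) {
  return BigInt(tokens) * 10n ** DECIMALS;
}
console.log(toBaseUnits(1).toString()); // "1000000" = 1 eUSDT
```

The first printed value should match the `totalSupply` string recorded in this directory's deployment.json.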
## Deployment Steps

### 1. Install dependencies

```bash
cd backend/services/blockchain-service/contracts/eUSDT
npm install
```

### 2. Compile the contract

```bash
node compile.mjs
```

Compilation writes the following into the `build/` directory:

- `EnergyUSDT.abi` - the contract ABI
- `EnergyUSDT.bin` - the contract bytecode

### 3. Deploy the contract

Make sure the deployer account holds enough KAVA for gas (about 0.02 KAVA):

```bash
node deploy.mjs
```
## Contract Functions

| Function | Description |
|------|------|
| `name()` | Returns "Energy USDT" |
| `symbol()` | Returns "eUSDT" |
| `decimals()` | Returns 6 |
| `totalSupply()` | Returns 10,002,000,000 * 10^6 |
| `balanceOf(address)` | Returns an account's balance |
| `transfer(address, uint256)` | Transfers tokens |
| `approve(address, uint256)` | Approves an allowance |
| `transferFrom(address, address, uint256)` | Delegated transfer |
| `allowance(address, address)` | Returns an approved allowance |

## Events

| Event | Description |
|------|------|
| `Transfer(from, to, value)` | Transfer event |
| `Approval(owner, spender, value)` | Approval event |
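As an illustration of the `approve`/`transferFrom` semantics listed above, here is a minimal in-memory model of the same `require()` checks. It is a sketch for explanation only, not the deployed contract:

```javascript
// Minimal model mirroring EnergyUSDT's transferFrom guards:
// sufficient balance and sufficient allowance, with the allowance
// decremented on each delegated transfer.
function makeToken(owner, supply) {
  const balances = { [owner]: supply };
  const allowances = {}; // "owner:spender" -> remaining allowance
  return {
    balanceOf: (a) => balances[a] ?? 0n,
    approve(owner, spender, amount) {
      allowances[`${owner}:${spender}`] = amount;
    },
    transferFrom(caller, from, to, amount) {
      const key = `${from}:${caller}`;
      if ((balances[from] ?? 0n) < amount) throw new Error('Insufficient balance');
      if ((allowances[key] ?? 0n) < amount) throw new Error('Insufficient allowance');
      balances[from] -= amount;
      balances[to] = (balances[to] ?? 0n) + amount;
      allowances[key] -= amount; // spend part of the allowance
    },
  };
}

const token = makeToken('alice', 1_000_000n);
token.approve('alice', 'bob', 300n);
token.transferFrom('bob', 'alice', 'carol', 200n);
console.log(token.balanceOf('carol')); // 200n
```

After the call, only 100 base units of allowance remain, so a second `transferFrom` of 200 would revert with "Insufficient allowance", just as the contract does.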
## Deployment Info

| Network | Contract Address | Explorer |
|------|---------|-----------|
| KAVA Mainnet | `0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931` | https://kavascan.com/address/0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931 |

**Deployment details:**

- Deployer / token owner: `0x4F7E78d6B7C5FC502Ec7039848690f08c8970F1E`
- Private key: `0x886ea4cffe76c386fecf3ff321ac9ae913737c46c17bc6ce2413752144668a2a`
- Initial holding: 10,002,000,000 eUSDT (the entire supply)
- Transaction hash: `0x5bebaa4a35378438ba5c891972024a1766935d2e01397a33502aa99e956a6b19`
- Deployed: 2026-01-19


@@ -0,0 +1,51 @@
import solc from 'solc';
import fs from 'fs';
const source = fs.readFileSync('EnergyUSDT.sol', 'utf8');
const input = {
language: 'Solidity',
sources: {
'EnergyUSDT.sol': {
content: source
}
},
settings: {
optimizer: {
enabled: true,
runs: 200
},
evmVersion: 'paris', // Use paris to avoid PUSH0
outputSelection: {
'*': {
'*': ['abi', 'evm.bytecode']
}
}
}
};
const output = JSON.parse(solc.compile(JSON.stringify(input)));
if (output.errors) {
output.errors.forEach(err => {
console.log(err.formattedMessage);
});
// Check for actual errors (not just warnings)
const hasErrors = output.errors.some(err => err.severity === 'error');
if (hasErrors) {
process.exit(1);
}
}
const contract = output.contracts['EnergyUSDT.sol']['EnergyUSDT'];
const bytecode = contract.evm.bytecode.object;
const abi = contract.abi;
fs.mkdirSync('build', { recursive: true });
fs.writeFileSync('build/EnergyUSDT.bin', bytecode);
fs.writeFileSync('build/EnergyUSDT.abi', JSON.stringify(abi, null, 2));
console.log('Compiled successfully!');
console.log('Bytecode length:', bytecode.length);
console.log('ABI functions:', abi.filter(x => x.type === 'function').map(x => x.name).join(', '));


@@ -0,0 +1,86 @@
import { ethers } from 'ethers';
import fs from 'fs';
// Same deployer account as dUSDT
const PRIVATE_KEY = '0x886ea4cffe76c386fecf3ff321ac9ae913737c46c17bc6ce2413752144668a2a';
const RPC_URL = 'https://evm.kava.io';
// Contract bytecode
const BYTECODE = '0x' + fs.readFileSync('build/EnergyUSDT.bin', 'utf8');
const ABI = JSON.parse(fs.readFileSync('build/EnergyUSDT.abi', 'utf8'));
async function deploy() {
// Connect to Kava mainnet
const provider = new ethers.JsonRpcProvider(RPC_URL);
const wallet = new ethers.Wallet(PRIVATE_KEY, provider);
console.log('Deployer address:', wallet.address);
// Check balance
const balance = await provider.getBalance(wallet.address);
console.log('Balance:', ethers.formatEther(balance), 'KAVA');
if (parseFloat(ethers.formatEther(balance)) < 0.01) {
console.error('Insufficient KAVA balance for deployment!');
process.exit(1);
}
// Get network info
const network = await provider.getNetwork();
console.log('Chain ID:', network.chainId.toString());
// Create contract factory
const factory = new ethers.ContractFactory(ABI, BYTECODE, wallet);
console.log('Deploying EnergyUSDT (eUSDT) contract...');
// Deploy
const contract = await factory.deploy();
console.log('Transaction hash:', contract.deploymentTransaction().hash);
// Wait for deployment
console.log('Waiting for confirmation...');
await contract.waitForDeployment();
const contractAddress = await contract.getAddress();
console.log('Contract deployed at:', contractAddress);
// Verify deployment
console.log('\nVerifying deployment...');
const name = await contract.name();
const symbol = await contract.symbol();
const decimals = await contract.decimals();
const totalSupply = await contract.totalSupply();
const ownerBalance = await contract.balanceOf(wallet.address);
console.log('Token name:', name);
console.log('Token symbol:', symbol);
console.log('Decimals:', decimals.toString());
console.log('Total supply:', ethers.formatUnits(totalSupply, 6), 'eUSDT');
console.log('Owner balance:', ethers.formatUnits(ownerBalance, 6), 'eUSDT');
console.log('\n=== DEPLOYMENT COMPLETE ===');
console.log('Contract Address:', contractAddress);
console.log('Explorer:', `https://kavascan.com/address/${contractAddress}`);
// Save deployment info
const deploymentInfo = {
network: 'KAVA Mainnet',
chainId: 2222,
contractAddress,
deployer: wallet.address,
transactionHash: contract.deploymentTransaction().hash,
deployedAt: new Date().toISOString(),
token: {
name,
symbol,
decimals: decimals.toString(),
totalSupply: totalSupply.toString()
}
};
fs.writeFileSync('deployment.json', JSON.stringify(deploymentInfo, null, 2));
console.log('\nDeployment info saved to deployment.json');
}
deploy().catch(console.error);


@@ -0,0 +1,14 @@
{
"network": "KAVA Mainnet",
"chainId": 2222,
"contractAddress": "0x7C3275D808eFbAE90C06C7E3A9AfDdcAa8563931",
"deployer": "0x4F7E78d6B7C5FC502Ec7039848690f08c8970F1E",
"transactionHash": "0x5bebaa4a35378438ba5c891972024a1766935d2e01397a33502aa99e956a6b19",
"deployedAt": "2026-01-19T13:25:28.071Z",
"token": {
"name": "Energy USDT",
"symbol": "eUSDT",
"decimals": "6",
"totalSupply": "10002000000000000"
}
}


@@ -0,0 +1,222 @@
{
"name": "eusdt-contract",
"version": "1.0.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "eusdt-contract",
"version": "1.0.0",
"dependencies": {
"ethers": "^6.9.0",
"solc": "^0.8.19"
}
},
"node_modules/@adraffy/ens-normalize": {
"version": "1.10.1",
"resolved": "https://registry.npmjs.org/@adraffy/ens-normalize/-/ens-normalize-1.10.1.tgz",
"integrity": "sha512-96Z2IP3mYmF1Xg2cDm8f1gWGf/HUVedQ3FMifV4kG/PQ4yEP51xDtRAEfhVNt5f/uzpNkZHwWQuUcu6D6K+Ekw==",
"license": "MIT"
},
"node_modules/@noble/curves": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@noble/curves/-/curves-1.2.0.tgz",
"integrity": "sha512-oYclrNgRaM9SsBUBVbb8M6DTV7ZHRTKugureoYEncY5c65HOmRzvSiTE3y5CYaPYJA/GVkrhXEoF0M3Ya9PMnw==",
"license": "MIT",
"dependencies": {
"@noble/hashes": "1.3.2"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@noble/hashes": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/@noble/hashes/-/hashes-1.3.2.tgz",
"integrity": "sha512-MVC8EAQp7MvEcm30KWENFjgR+Mkmf+D189XJTkFIlwohU5hcBbn1ZkKq7KVTi2Hme3PMGF390DaL52beVrIihQ==",
"license": "MIT",
"engines": {
"node": ">= 16"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@types/node": {
"version": "22.7.5",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.7.5.tgz",
"integrity": "sha512-jML7s2NAzMWc//QSJ1a3prpk78cOPchGvXJsC3C6R6PSMoooztvRVQEz89gmBTBY1SPMaqo5teB4uNHPdetShQ==",
"license": "MIT",
"dependencies": {
"undici-types": "~6.19.2"
}
},
"node_modules/aes-js": {
"version": "4.0.0-beta.5",
"resolved": "https://registry.npmjs.org/aes-js/-/aes-js-4.0.0-beta.5.tgz",
"integrity": "sha512-G965FqalsNyrPqgEGON7nIx1e/OVENSgiEIzyC63haUMuvNnwIgIjMs52hlTCKhkBny7A2ORNlfY9Zu+jmGk1Q==",
"license": "MIT"
},
"node_modules/command-exists": {
"version": "1.2.9",
"resolved": "https://registry.npmjs.org/command-exists/-/command-exists-1.2.9.tgz",
"integrity": "sha512-LTQ/SGc+s0Xc0Fu5WaKnR0YiygZkm9eKFvyS+fRsU7/ZWFF8ykFM6Pc9aCVf1+xasOOZpO3BAVgVrKvsqKHV7w==",
"license": "MIT"
},
"node_modules/commander": {
"version": "8.3.0",
"resolved": "https://registry.npmjs.org/commander/-/commander-8.3.0.tgz",
"integrity": "sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww==",
"license": "MIT",
"engines": {
"node": ">= 12"
}
},
"node_modules/ethers": {
"version": "6.16.0",
"resolved": "https://registry.npmjs.org/ethers/-/ethers-6.16.0.tgz",
"integrity": "sha512-U1wulmetNymijEhpSEQ7Ct/P/Jw9/e7R1j5XIbPRydgV2DjLVMsULDlNksq3RQnFgKoLlZf88ijYtWEXcPa07A==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/ethers-io/"
},
{
"type": "individual",
"url": "https://www.buymeacoffee.com/ricmoo"
}
],
"license": "MIT",
"dependencies": {
"@adraffy/ens-normalize": "1.10.1",
"@noble/curves": "1.2.0",
"@noble/hashes": "1.3.2",
"@types/node": "22.7.5",
"aes-js": "4.0.0-beta.5",
"tslib": "2.7.0",
"ws": "8.17.1"
},
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/follow-redirects": {
"version": "1.15.11",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/RubenVerborgh"
}
],
"license": "MIT",
"engines": {
"node": ">=4.0"
},
"peerDependenciesMeta": {
"debug": {
"optional": true
}
}
},
"node_modules/js-sha3": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/js-sha3/-/js-sha3-0.8.0.tgz",
"integrity": "sha512-gF1cRrHhIzNfToc802P800N8PpXS+evLLXfsVpowqmAFR9uwbi89WvXg2QspOmXL8QL86J4T1EpFu+yUkwJY3Q==",
"license": "MIT"
},
"node_modules/memorystream": {
"version": "0.3.1",
"resolved": "https://registry.npmjs.org/memorystream/-/memorystream-0.3.1.tgz",
"integrity": "sha512-S3UwM3yj5mtUSEfP41UZmt/0SCoVYUcU1rkXv+BQ5Ig8ndL4sPoJNBUJERafdPb5jjHJGuMgytgKvKIf58XNBw==",
"engines": {
"node": ">= 0.10.0"
}
},
"node_modules/os-tmpdir": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/os-tmpdir/-/os-tmpdir-1.0.2.tgz",
"integrity": "sha512-D2FR03Vir7FIu45XBY20mTb+/ZSWB00sjU9jdQXt83gDrI4Ztz5Fs7/yy74g2N5SVQY4xY1qDr4rNddwYRVX0g==",
"license": "MIT",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/semver": {
"version": "5.7.2",
"resolved": "https://registry.npmjs.org/semver/-/semver-5.7.2.tgz",
"integrity": "sha512-cBznnQ9KjJqU67B52RMC65CMarK2600WFnbkcaiwWq3xy/5haFJlshgnpjovMVJ+Hff49d8GEn0b87C5pDQ10g==",
"license": "ISC",
"bin": {
"semver": "bin/semver"
}
},
"node_modules/solc": {
"version": "0.8.19",
"resolved": "https://registry.npmjs.org/solc/-/solc-0.8.19.tgz",
"integrity": "sha512-yqurS3wzC4LdEvmMobODXqprV4MYJcVtinuxgrp61ac8K2zz40vXA0eSAskSHPgv8dQo7Nux39i3QBsHx4pqyA==",
"license": "MIT",
"dependencies": {
"command-exists": "^1.2.8",
"commander": "^8.1.0",
"follow-redirects": "^1.12.1",
"js-sha3": "0.8.0",
"memorystream": "^0.3.1",
"semver": "^5.5.0",
"tmp": "0.0.33"
},
"bin": {
"solcjs": "solc.js"
},
"engines": {
"node": ">=10.0.0"
}
},
"node_modules/tmp": {
"version": "0.0.33",
"resolved": "https://registry.npmjs.org/tmp/-/tmp-0.0.33.tgz",
"integrity": "sha512-jRCJlojKnZ3addtTOjdIqoRuPEKBvNXcGYqzO6zWZX8KfKEpnGY5jfggJQ3EjKuu8D4bJRr0y+cYJFmYbImXGw==",
"license": "MIT",
"dependencies": {
"os-tmpdir": "~1.0.2"
},
"engines": {
"node": ">=0.6.0"
}
},
"node_modules/tslib": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.7.0.tgz",
"integrity": "sha512-gLXCKdN1/j47AiHiOkJN69hJmcbGTHI0ImLmbYLHykhgeN0jVGola9yVjFgzCUklsZQMW55o+dW7IXv3RCXDzA==",
"license": "0BSD"
},
"node_modules/undici-types": {
"version": "6.19.8",
"resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.19.8.tgz",
"integrity": "sha512-ve2KP6f/JnbPBFyobGHuerC9g1FYGn/F8n1LWTwNxCEzd6IfqTwUQcNXgEtmmQ6DlRrC1hrSrBnCZPokRrDHjw==",
"license": "MIT"
},
"node_modules/ws": {
"version": "8.17.1",
"resolved": "https://registry.npmjs.org/ws/-/ws-8.17.1.tgz",
"integrity": "sha512-6XQFvXTkbfUOZOKKILFG1PDK2NDQs4azKQl26T0YS5CxqWLgXajbPZ+h4gZekJyRqFU8pvnbAbbs/3TgRPy+GQ==",
"license": "MIT",
"engines": {
"node": ">=10.0.0"
},
"peerDependencies": {
"bufferutil": "^4.0.1",
"utf-8-validate": ">=5.0.2"
},
"peerDependenciesMeta": {
"bufferutil": {
"optional": true
},
"utf-8-validate": {
"optional": true
}
}
}
}
}


@@ -0,0 +1,14 @@
{
"name": "eusdt-contract",
"version": "1.0.0",
"type": "module",
"description": "Energy USDT (eUSDT) ERC-20 Token Contract",
"scripts": {
"compile": "node compile.mjs",
"deploy": "node deploy.mjs"
},
"dependencies": {
"ethers": "^6.9.0",
"solc": "^0.8.19"
}
}


@@ -0,0 +1,78 @@
// SPDX-License-Identifier: MIT
pragma solidity 0.8.19;
/**
* @title FutureUSDT
* @dev Fixed supply ERC-20 token - NO MINTING CAPABILITY
* Total Supply: 1,000,000,000,000 (1 Trillion) tokens with 6 decimals (matching USDT)
*
* IMPORTANT: This contract has NO mint function and NO way to increase supply.
* All tokens are minted to the deployer at construction time.
*/
contract FutureUSDT {
string public constant name = "Future USDT";
string public constant symbol = "fUSDT";
uint8 public constant decimals = 6;
// Fixed total supply: 1 trillion tokens (1,000,000,000,000 * 10^6)
uint256 public constant totalSupply = 1_000_000_000_000 * 10**6;
mapping(address => uint256) private _balances;
mapping(address => mapping(address => uint256)) private _allowances;
event Transfer(address indexed from, address indexed to, uint256 value);
event Approval(address indexed owner, address indexed spender, uint256 value);
/**
* @dev Constructor - mints entire fixed supply to deployer
* No mint function exists - supply is permanently fixed
*/
constructor() {
_balances[msg.sender] = totalSupply;
emit Transfer(address(0), msg.sender, totalSupply);
}
function balanceOf(address account) public view returns (uint256) {
return _balances[account];
}
function transfer(address to, uint256 amount) public returns (bool) {
require(to != address(0), "Transfer to zero address");
require(_balances[msg.sender] >= amount, "Insufficient balance");
unchecked {
_balances[msg.sender] -= amount;
_balances[to] += amount;
}
emit Transfer(msg.sender, to, amount);
return true;
}
function allowance(address owner, address spender) public view returns (uint256) {
return _allowances[owner][spender];
}
function approve(address spender, uint256 amount) public returns (bool) {
require(spender != address(0), "Approve to zero address");
_allowances[msg.sender][spender] = amount;
emit Approval(msg.sender, spender, amount);
return true;
}
function transferFrom(address from, address to, uint256 amount) public returns (bool) {
require(from != address(0), "Transfer from zero address");
require(to != address(0), "Transfer to zero address");
require(_balances[from] >= amount, "Insufficient balance");
require(_allowances[from][msg.sender] >= amount, "Insufficient allowance");
unchecked {
_balances[from] -= amount;
_balances[to] += amount;
_allowances[from][msg.sender] -= amount;
}
emit Transfer(from, to, amount);
return true;
}
}


@@ -0,0 +1,81 @@
# fUSDT (Future USDT)

## Token Information

| Property | Value |
|------|-----|
| Name | Future USDT |
| Symbol | fUSDT |
| Decimals | 6 |
| Total Supply | 1,000,000,000,000 (1 trillion) |
| Standard | ERC-20 |
| Chain | KAVA Mainnet (Chain ID: 2222) |

## Contract Features

- **Fixed supply**: 1 trillion tokens, all minted to the deployer at deployment time
- **No minting**: the contract has no mint function; the supply is permanently fixed
- **No burning**: no burn capability at the contract level
- **Not upgradeable**: the contract logic is permanently fixed
- **Standard ERC-20**: fully compatible with mainstream wallets and DEXes
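A quick off-chain check of the fixed supply in 6-decimal base units (a sketch; the result should match the `totalSupply` string recorded in this directory's deployment.json):

```javascript
// FutureUSDT: 1 trillion tokens with 6 decimals, so the on-chain
// totalSupply is 10^12 * 10^6 = 10^18 base units.
const SUPPLY_BASE_UNITS = 1_000_000_000_000n * 10n ** 6n;
console.log(SUPPLY_BASE_UNITS.toString()); // "1000000000000000000"
```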
## Deployment Steps

### 1. Install dependencies

```bash
cd backend/services/blockchain-service/contracts/fUSDT
npm install
```

### 2. Compile the contract

```bash
node compile.mjs
```

Compilation writes the following into the `build/` directory:

- `FutureUSDT.abi` - the contract ABI
- `FutureUSDT.bin` - the contract bytecode

### 3. Deploy the contract

Make sure the deployer account holds enough KAVA for gas (about 0.02 KAVA):

```bash
node deploy.mjs
```
## Contract Functions

| Function | Description |
|------|------|
| `name()` | Returns "Future USDT" |
| `symbol()` | Returns "fUSDT" |
| `decimals()` | Returns 6 |
| `totalSupply()` | Returns 1,000,000,000,000 * 10^6 |
| `balanceOf(address)` | Returns an account's balance |
| `transfer(address, uint256)` | Transfers tokens |
| `approve(address, uint256)` | Approves an allowance |
| `transferFrom(address, address, uint256)` | Delegated transfer |
| `allowance(address, address)` | Returns an approved allowance |

## Events

| Event | Description |
|------|------|
| `Transfer(from, to, value)` | Transfer event |
| `Approval(owner, spender, value)` | Approval event |
## Deployment Info

| Network | Contract Address | Explorer |
|------|---------|-----------|
| KAVA Mainnet | `0x14dc4f7d3E4197438d058C3D156dd9826A161134` | https://kavascan.com/address/0x14dc4f7d3E4197438d058C3D156dd9826A161134 |

**Deployment details:**

- Deployer / token owner: `0x4F7E78d6B7C5FC502Ec7039848690f08c8970F1E`
- Private key: `0x886ea4cffe76c386fecf3ff321ac9ae913737c46c17bc6ce2413752144668a2a`
- Initial holding: 1,000,000,000,000 fUSDT (the entire supply)
- Transaction hash: `0x071f535971bc3a134dd26c182b6f05c53f0c3783e91fe6ef471d6c914e4cdb06`
- Deployed: 2026-01-19


@@ -0,0 +1,51 @@
import solc from 'solc';
import fs from 'fs';
const source = fs.readFileSync('FutureUSDT.sol', 'utf8');
const input = {
language: 'Solidity',
sources: {
'FutureUSDT.sol': {
content: source
}
},
settings: {
optimizer: {
enabled: true,
runs: 200
},
evmVersion: 'paris', // Use paris to avoid PUSH0
outputSelection: {
'*': {
'*': ['abi', 'evm.bytecode']
}
}
}
};
const output = JSON.parse(solc.compile(JSON.stringify(input)));
if (output.errors) {
output.errors.forEach(err => {
console.log(err.formattedMessage);
});
// Check for actual errors (not just warnings)
const hasErrors = output.errors.some(err => err.severity === 'error');
if (hasErrors) {
process.exit(1);
}
}
const contract = output.contracts['FutureUSDT.sol']['FutureUSDT'];
const bytecode = contract.evm.bytecode.object;
const abi = contract.abi;
fs.mkdirSync('build', { recursive: true });
fs.writeFileSync('build/FutureUSDT.bin', bytecode);
fs.writeFileSync('build/FutureUSDT.abi', JSON.stringify(abi, null, 2));
console.log('Compiled successfully!');
console.log('Bytecode length:', bytecode.length);
console.log('ABI functions:', abi.filter(x => x.type === 'function').map(x => x.name).join(', '));


@@ -0,0 +1,86 @@
import { ethers } from 'ethers';
import fs from 'fs';
// Same deployer account as dUSDT
const PRIVATE_KEY = '0x886ea4cffe76c386fecf3ff321ac9ae913737c46c17bc6ce2413752144668a2a';
const RPC_URL = 'https://evm.kava.io';
// Contract bytecode
const BYTECODE = '0x' + fs.readFileSync('build/FutureUSDT.bin', 'utf8');
const ABI = JSON.parse(fs.readFileSync('build/FutureUSDT.abi', 'utf8'));
async function deploy() {
// Connect to Kava mainnet
const provider = new ethers.JsonRpcProvider(RPC_URL);
const wallet = new ethers.Wallet(PRIVATE_KEY, provider);
console.log('Deployer address:', wallet.address);
// Check balance
const balance = await provider.getBalance(wallet.address);
console.log('Balance:', ethers.formatEther(balance), 'KAVA');
if (parseFloat(ethers.formatEther(balance)) < 0.01) {
console.error('Insufficient KAVA balance for deployment!');
process.exit(1);
}
// Get network info
const network = await provider.getNetwork();
console.log('Chain ID:', network.chainId.toString());
// Create contract factory
const factory = new ethers.ContractFactory(ABI, BYTECODE, wallet);
console.log('Deploying FutureUSDT (fUSDT) contract...');
// Deploy
const contract = await factory.deploy();
console.log('Transaction hash:', contract.deploymentTransaction().hash);
// Wait for deployment
console.log('Waiting for confirmation...');
await contract.waitForDeployment();
const contractAddress = await contract.getAddress();
console.log('Contract deployed at:', contractAddress);
// Verify deployment
console.log('\nVerifying deployment...');
const name = await contract.name();
const symbol = await contract.symbol();
const decimals = await contract.decimals();
const totalSupply = await contract.totalSupply();
const ownerBalance = await contract.balanceOf(wallet.address);
console.log('Token name:', name);
console.log('Token symbol:', symbol);
console.log('Decimals:', decimals.toString());
console.log('Total supply:', ethers.formatUnits(totalSupply, 6), 'fUSDT');
console.log('Owner balance:', ethers.formatUnits(ownerBalance, 6), 'fUSDT');
console.log('\n=== DEPLOYMENT COMPLETE ===');
console.log('Contract Address:', contractAddress);
console.log('Explorer:', `https://kavascan.com/address/${contractAddress}`);
// Save deployment info
const deploymentInfo = {
network: 'KAVA Mainnet',
chainId: 2222,
contractAddress,
deployer: wallet.address,
transactionHash: contract.deploymentTransaction().hash,
deployedAt: new Date().toISOString(),
token: {
name,
symbol,
decimals: decimals.toString(),
totalSupply: totalSupply.toString()
}
};
fs.writeFileSync('deployment.json', JSON.stringify(deploymentInfo, null, 2));
console.log('\nDeployment info saved to deployment.json');
}
deploy().catch(console.error);


@@ -0,0 +1,14 @@
{
"network": "KAVA Mainnet",
"chainId": 2222,
"contractAddress": "0x14dc4f7d3E4197438d058C3D156dd9826A161134",
"deployer": "0x4F7E78d6B7C5FC502Ec7039848690f08c8970F1E",
"transactionHash": "0x071f535971bc3a134dd26c182b6f05c53f0c3783e91fe6ef471d6c914e4cdb06",
"deployedAt": "2026-01-19T13:26:05.111Z",
"token": {
"name": "Future USDT",
"symbol": "fUSDT",
"decimals": "6",
"totalSupply": "1000000000000000000"
}
}


@@ -0,0 +1,222 @@
{
"name": "fusdt-contract",
"version": "1.0.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "fusdt-contract",
"version": "1.0.0",
"dependencies": {
"ethers": "^6.9.0",
"solc": "^0.8.19"
}
},
"node_modules/@adraffy/ens-normalize": {
"version": "1.10.1",
"resolved": "https://registry.npmjs.org/@adraffy/ens-normalize/-/ens-normalize-1.10.1.tgz",
"integrity": "sha512-96Z2IP3mYmF1Xg2cDm8f1gWGf/HUVedQ3FMifV4kG/PQ4yEP51xDtRAEfhVNt5f/uzpNkZHwWQuUcu6D6K+Ekw==",
"license": "MIT"
},
"node_modules/@noble/curves": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@noble/curves/-/curves-1.2.0.tgz",
"integrity": "sha512-oYclrNgRaM9SsBUBVbb8M6DTV7ZHRTKugureoYEncY5c65HOmRzvSiTE3y5CYaPYJA/GVkrhXEoF0M3Ya9PMnw==",
"license": "MIT",
"dependencies": {
"@noble/hashes": "1.3.2"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@noble/hashes": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/@noble/hashes/-/hashes-1.3.2.tgz",
"integrity": "sha512-MVC8EAQp7MvEcm30KWENFjgR+Mkmf+D189XJTkFIlwohU5hcBbn1ZkKq7KVTi2Hme3PMGF390DaL52beVrIihQ==",
"license": "MIT",
"engines": {
"node": ">= 16"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@types/node": {
"version": "22.7.5",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.7.5.tgz",
"integrity": "sha512-jML7s2NAzMWc//QSJ1a3prpk78cOPchGvXJsC3C6R6PSMoooztvRVQEz89gmBTBY1SPMaqo5teB4uNHPdetShQ==",
"license": "MIT",
"dependencies": {
"undici-types": "~6.19.2"
}
},
"node_modules/aes-js": {
"version": "4.0.0-beta.5",
"resolved": "https://registry.npmjs.org/aes-js/-/aes-js-4.0.0-beta.5.tgz",
"integrity": "sha512-G965FqalsNyrPqgEGON7nIx1e/OVENSgiEIzyC63haUMuvNnwIgIjMs52hlTCKhkBny7A2ORNlfY9Zu+jmGk1Q==",
"license": "MIT"
},
"node_modules/command-exists": {
"version": "1.2.9",
"resolved": "https://registry.npmjs.org/command-exists/-/command-exists-1.2.9.tgz",
"integrity": "sha512-LTQ/SGc+s0Xc0Fu5WaKnR0YiygZkm9eKFvyS+fRsU7/ZWFF8ykFM6Pc9aCVf1+xasOOZpO3BAVgVrKvsqKHV7w==",
"license": "MIT"
},
"node_modules/commander": {
"version": "8.3.0",
"resolved": "https://registry.npmjs.org/commander/-/commander-8.3.0.tgz",
"integrity": "sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww==",
"license": "MIT",
"engines": {
"node": ">= 12"
}
},
"node_modules/ethers": {
"version": "6.16.0",
"resolved": "https://registry.npmjs.org/ethers/-/ethers-6.16.0.tgz",
"integrity": "sha512-U1wulmetNymijEhpSEQ7Ct/P/Jw9/e7R1j5XIbPRydgV2DjLVMsULDlNksq3RQnFgKoLlZf88ijYtWEXcPa07A==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/ethers-io/"
},
{
"type": "individual",
"url": "https://www.buymeacoffee.com/ricmoo"
}
],
"license": "MIT",
"dependencies": {
"@adraffy/ens-normalize": "1.10.1",
"@noble/curves": "1.2.0",
"@noble/hashes": "1.3.2",
"@types/node": "22.7.5",
"aes-js": "4.0.0-beta.5",
"tslib": "2.7.0",
"ws": "8.17.1"
},
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/follow-redirects": {
"version": "1.15.11",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/RubenVerborgh"
}
],
"license": "MIT",
"engines": {
"node": ">=4.0"
},
"peerDependenciesMeta": {
"debug": {
"optional": true
}
}
},
"node_modules/js-sha3": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/js-sha3/-/js-sha3-0.8.0.tgz",
"integrity": "sha512-gF1cRrHhIzNfToc802P800N8PpXS+evLLXfsVpowqmAFR9uwbi89WvXg2QspOmXL8QL86J4T1EpFu+yUkwJY3Q==",
"license": "MIT"
},
"node_modules/memorystream": {
"version": "0.3.1",
"resolved": "https://registry.npmjs.org/memorystream/-/memorystream-0.3.1.tgz",
"integrity": "sha512-S3UwM3yj5mtUSEfP41UZmt/0SCoVYUcU1rkXv+BQ5Ig8ndL4sPoJNBUJERafdPb5jjHJGuMgytgKvKIf58XNBw==",
"engines": {
"node": ">= 0.10.0"
}
},
"node_modules/os-tmpdir": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/os-tmpdir/-/os-tmpdir-1.0.2.tgz",
"integrity": "sha512-D2FR03Vir7FIu45XBY20mTb+/ZSWB00sjU9jdQXt83gDrI4Ztz5Fs7/yy74g2N5SVQY4xY1qDr4rNddwYRVX0g==",
"license": "MIT",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/semver": {
"version": "5.7.2",
"resolved": "https://registry.npmjs.org/semver/-/semver-5.7.2.tgz",
"integrity": "sha512-cBznnQ9KjJqU67B52RMC65CMarK2600WFnbkcaiwWq3xy/5haFJlshgnpjovMVJ+Hff49d8GEn0b87C5pDQ10g==",
"license": "ISC",
"bin": {
"semver": "bin/semver"
}
},
"node_modules/solc": {
"version": "0.8.19",
"resolved": "https://registry.npmjs.org/solc/-/solc-0.8.19.tgz",
"integrity": "sha512-yqurS3wzC4LdEvmMobODXqprV4MYJcVtinuxgrp61ac8K2zz40vXA0eSAskSHPgv8dQo7Nux39i3QBsHx4pqyA==",
"license": "MIT",
"dependencies": {
"command-exists": "^1.2.8",
"commander": "^8.1.0",
"follow-redirects": "^1.12.1",
"js-sha3": "0.8.0",
"memorystream": "^0.3.1",
"semver": "^5.5.0",
"tmp": "0.0.33"
},
"bin": {
"solcjs": "solc.js"
},
"engines": {
"node": ">=10.0.0"
}
},
"node_modules/tmp": {
"version": "0.0.33",
"resolved": "https://registry.npmjs.org/tmp/-/tmp-0.0.33.tgz",
"integrity": "sha512-jRCJlojKnZ3addtTOjdIqoRuPEKBvNXcGYqzO6zWZX8KfKEpnGY5jfggJQ3EjKuu8D4bJRr0y+cYJFmYbImXGw==",
"license": "MIT",
"dependencies": {
"os-tmpdir": "~1.0.2"
},
"engines": {
"node": ">=0.6.0"
}
},
"node_modules/tslib": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.7.0.tgz",
"integrity": "sha512-gLXCKdN1/j47AiHiOkJN69hJmcbGTHI0ImLmbYLHykhgeN0jVGola9yVjFgzCUklsZQMW55o+dW7IXv3RCXDzA==",
"license": "0BSD"
},
"node_modules/undici-types": {
"version": "6.19.8",
"resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.19.8.tgz",
"integrity": "sha512-ve2KP6f/JnbPBFyobGHuerC9g1FYGn/F8n1LWTwNxCEzd6IfqTwUQcNXgEtmmQ6DlRrC1hrSrBnCZPokRrDHjw==",
"license": "MIT"
},
"node_modules/ws": {
"version": "8.17.1",
"resolved": "https://registry.npmjs.org/ws/-/ws-8.17.1.tgz",
"integrity": "sha512-6XQFvXTkbfUOZOKKILFG1PDK2NDQs4azKQl26T0YS5CxqWLgXajbPZ+h4gZekJyRqFU8pvnbAbbs/3TgRPy+GQ==",
"license": "MIT",
"engines": {
"node": ">=10.0.0"
},
"peerDependencies": {
"bufferutil": "^4.0.1",
"utf-8-validate": ">=5.0.2"
},
"peerDependenciesMeta": {
"bufferutil": {
"optional": true
},
"utf-8-validate": {
"optional": true
}
}
}
}
}


@@ -0,0 +1,14 @@
{
"name": "fusdt-contract",
"version": "1.0.0",
"type": "module",
"description": "Future USDT (fUSDT) ERC-20 Token Contract",
"scripts": {
"compile": "node compile.mjs",
"deploy": "node deploy.mjs"
},
"dependencies": {
"ethers": "^6.9.0",
"solc": "^0.8.19"
}
}


@ -1,7 +1,6 @@
-- ============================================================================ -- ============================================================================
-- contribution-service 初始化 migration -- contribution-service 初始化 migration
-- 合并自: 20260111000000_init, 20260111100000_add_referral_user_ids, -- 合并自: 0001_init, 0002_add_transactional_idempotency, 20250120000001_add_region_to_system_accounts
-- 20260112020000_fix_status_varchar_length, 20260112200000_add_adoption_province_city
-- ============================================================================ -- ============================================================================
-- ============================================ -- ============================================
@ -228,8 +227,9 @@ CREATE INDEX "unallocated_contributions_status_idx" ON "unallocated_contribution
CREATE TABLE "system_accounts" ( CREATE TABLE "system_accounts" (
"id" BIGSERIAL NOT NULL, "id" BIGSERIAL NOT NULL,
"account_type" VARCHAR(20) NOT NULL, "account_type" TEXT NOT NULL,
"name" VARCHAR(100) NOT NULL, "region_code" TEXT,
"name" TEXT NOT NULL,
"contribution_balance" DECIMAL(30,10) NOT NULL DEFAULT 0, "contribution_balance" DECIMAL(30,10) NOT NULL DEFAULT 0,
"contribution_never_expires" BOOLEAN NOT NULL DEFAULT false, "contribution_never_expires" BOOLEAN NOT NULL DEFAULT false,
"version" INTEGER NOT NULL DEFAULT 1, "version" INTEGER NOT NULL DEFAULT 1,
@@ -239,18 +239,26 @@ CREATE TABLE "system_accounts" (
     CONSTRAINT "system_accounts_pkey" PRIMARY KEY ("id")
 );
-CREATE UNIQUE INDEX "system_accounts_account_type_key" ON "system_accounts"("account_type");
+CREATE UNIQUE INDEX "system_accounts_account_type_region_code_key" ON "system_accounts"("account_type", "region_code");
+CREATE INDEX "system_accounts_account_type_idx" ON "system_accounts"("account_type");
+CREATE INDEX "system_accounts_region_code_idx" ON "system_accounts"("region_code");
 CREATE TABLE "system_contribution_records" (
     "id" BIGSERIAL NOT NULL,
     "system_account_id" BIGINT NOT NULL,
     "source_adoption_id" BIGINT NOT NULL,
     "source_account_sequence" VARCHAR(20) NOT NULL,
+    -- Source type: FIXED_RATE (fixed ratio) / LEVEL_OVERFLOW (level overflow) / LEVEL_NO_ANCESTOR (no upline) / BONUS_TIER_1/2/3 (team bonus not unlocked)
+    "source_type" VARCHAR(30) NOT NULL,
+    -- Level depth 1-15; only meaningful for the LEVEL_OVERFLOW and LEVEL_NO_ANCESTOR types
+    "level_depth" INTEGER,
     "distribution_rate" DECIMAL(10,6) NOT NULL,
     "amount" DECIMAL(30,10) NOT NULL,
     "effective_date" DATE NOT NULL,
     "expire_date" DATE,
     "is_expired" BOOLEAN NOT NULL DEFAULT false,
+    -- Soft-delete timestamp
+    "deleted_at" TIMESTAMP(3),
     "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
     CONSTRAINT "system_contribution_records_pkey" PRIMARY KEY ("id")
@@ -258,6 +266,8 @@ CREATE TABLE "system_contribution_records" (
 CREATE INDEX "system_contribution_records_system_account_id_idx" ON "system_contribution_records"("system_account_id");
 CREATE INDEX "system_contribution_records_source_adoption_id_idx" ON "system_contribution_records"("source_adoption_id");
+CREATE INDEX "system_contribution_records_source_type_idx" ON "system_contribution_records"("source_type");
+CREATE INDEX "system_contribution_records_deleted_at_idx" ON "system_contribution_records"("deleted_at");
 ALTER TABLE "system_contribution_records" ADD CONSTRAINT "system_contribution_records_system_account_id_fkey" FOREIGN KEY ("system_account_id") REFERENCES "system_accounts"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
@@ -327,20 +337,36 @@ CREATE TABLE "cdc_sync_progress" (
 CREATE UNIQUE INDEX "cdc_sync_progress_source_topic_key" ON "cdc_sync_progress"("source_topic");
+-- 2.0 inter-service Outbox event idempotency table
 CREATE TABLE "processed_events" (
     "id" BIGSERIAL NOT NULL,
     "event_id" VARCHAR(100) NOT NULL,
     "event_type" VARCHAR(50) NOT NULL,
-    "source_service" VARCHAR(50),
+    "source_service" VARCHAR(100) NOT NULL,
     "processed_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
     CONSTRAINT "processed_events_pkey" PRIMARY KEY ("id")
 );
-CREATE UNIQUE INDEX "processed_events_event_id_key" ON "processed_events"("event_id");
+CREATE UNIQUE INDEX "processed_events_source_service_event_id_key" ON "processed_events"("source_service", "event_id");
 CREATE INDEX "processed_events_event_type_idx" ON "processed_events"("event_type");
 CREATE INDEX "processed_events_processed_at_idx" ON "processed_events"("processed_at");
+-- 1.0 CDC event idempotency table
+CREATE TABLE "processed_cdc_events" (
+    "id" BIGSERIAL NOT NULL,
+    "source_topic" VARCHAR(200) NOT NULL,
+    "offset" BIGINT NOT NULL,
+    "table_name" VARCHAR(100) NOT NULL,
+    "operation" VARCHAR(10) NOT NULL,
+    "processed_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    CONSTRAINT "processed_cdc_events_pkey" PRIMARY KEY ("id")
+);
+CREATE UNIQUE INDEX "processed_cdc_events_source_topic_offset_key" ON "processed_cdc_events"("source_topic", "offset");
+CREATE INDEX "processed_cdc_events_processed_at_idx" ON "processed_cdc_events"("processed_at");
 -- ============================================
 -- 9. Configuration tables
 -- ============================================


@@ -1,45 +0,0 @@
--- ============================================================================
--- Add transactional idempotent-consumption support
--- Used for 100% exactly-once semantics in the 1.0 -> 2.0 CDC sync
--- ============================================================================
--- 1. Create the processed_cdc_events table (CDC event idempotency)
--- Unique key: (source_topic, offset) - Kafka topic name + message offset
--- Guarantees each CDC event is processed only once (exactly-once semantics)
-CREATE TABLE IF NOT EXISTS "processed_cdc_events" (
-    "id" BIGSERIAL NOT NULL,
-    "source_topic" VARCHAR(200) NOT NULL, -- Kafka topic name (e.g. cdc.identity.public.user_accounts)
-    "offset" BIGINT NOT NULL, -- Kafka message offset (unique within a partition)
-    "table_name" VARCHAR(100) NOT NULL, -- source table name
-    "operation" VARCHAR(10) NOT NULL, -- CDC operation type: c(create), u(update), d(delete), r(snapshot read)
-    "processed_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
-    CONSTRAINT "processed_cdc_events_pkey" PRIMARY KEY ("id")
-);
--- Composite unique index: (source_topic, offset) guarantees idempotency
--- Note: this is the Kafka message identifier, not a database auto-increment ID
-CREATE UNIQUE INDEX "processed_cdc_events_source_topic_offset_key" ON "processed_cdc_events"("source_topic", "offset");
--- Time index, used for cleaning up old data
-CREATE INDEX "processed_cdc_events_processed_at_idx" ON "processed_cdc_events"("processed_at");
--- 2. Fix the processed_events table (2.0 inter-service Outbox event idempotency)
--- Unique key: (source_service, event_id) - service name + outbox table ID
--- Outbox IDs from different services may collide, so the service name must join the composite unique key
--- 2.1 Alter the source_service column: widen 50 -> 100 and set NOT NULL
--- First backfill existing NULL values with a default
-UPDATE "processed_events" SET "source_service" = 'unknown' WHERE "source_service" IS NULL;
--- Alter the column type and constraint
-ALTER TABLE "processed_events"
-    ALTER COLUMN "source_service" SET NOT NULL,
-    ALTER COLUMN "source_service" TYPE VARCHAR(100);
--- 2.2 Drop the old single-column unique index
-DROP INDEX IF EXISTS "processed_events_event_id_key";
--- 2.3 Create the new composite unique index
--- The index name uses snake_case to stay consistent with the column names
-CREATE UNIQUE INDEX IF NOT EXISTS "processed_events_source_service_event_id_key" ON "processed_events"("source_service", "event_id");
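The (source_topic, offset) composite key documented in the deleted migration above is what makes CDC consumption idempotent. A minimal TypeScript sketch of the consume-side check, under the assumption that an in-memory Set stands in for the table's unique index (in the real service the key insert and the business write share one database transaction):

```typescript
// Sketch: idempotent CDC consumption keyed by (source_topic, offset).
// A Set stands in for the processed_cdc_events unique index.
type CdcEvent = { sourceTopic: string; offset: bigint; payload: string };

class IdempotentCdcConsumer {
  private processed = new Set<string>();
  readonly applied: string[] = [];

  // Returns true if the event was applied, false if it was a duplicate delivery.
  handle(event: CdcEvent): boolean {
    const key = `${event.sourceTopic}:${event.offset}`;
    if (this.processed.has(key)) {
      return false; // already seen: e.g. redelivery after a consumer restart
    }
    this.processed.add(key);
    this.applied.push(event.payload); // business write and key insert together
    return true;
  }
}
```

Replaying the same offset after a crash then becomes a no-op, which is what turns Kafka's at-least-once delivery into an effectively exactly-once pipeline.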


@@ -299,9 +299,10 @@ model UnallocatedContribution {
 // System accounts (operation / province / city / headquarters)
 model SystemAccount {
   id                       BigInt  @id @default(autoincrement())
-  accountType              String  @unique @map("account_type") @db.VarChar(20) // OPERATION / PROVINCE / CITY / HEADQUARTERS
-  name                     String  @db.VarChar(100)
+  accountType              String  @map("account_type") // OPERATION / PROVINCE / CITY / HEADQUARTERS
+  regionCode               String? @map("region_code") // province/city code, e.g. 440000, 440100
+  name                     String
   contributionBalance      Decimal @default(0) @map("contribution_balance") @db.Decimal(30, 10)
   contributionNeverExpires Boolean @default(false) @map("contribution_never_expires")
@@ -313,6 +314,9 @@ model SystemAccount {
   records SystemContributionRecord[]
+  @@unique([accountType, regionCode])
+  @@index([accountType])
+  @@index([regionCode])
   @@map("system_accounts")
 }
@@ -323,6 +327,11 @@ model SystemContributionRecord {
   sourceAdoptionId      BigInt  @map("source_adoption_id")
   sourceAccountSequence String  @map("source_account_sequence") @db.VarChar(20)
+  // Source type: FIXED_RATE (fixed ratio) / LEVEL_OVERFLOW (level overflow) / LEVEL_NO_ANCESTOR (no upline) / BONUS_TIER_1/2/3 (team bonus not unlocked)
+  sourceType            String  @map("source_type") @db.VarChar(30)
+  // Level depth: which level (1-15) for LEVEL_OVERFLOW and LEVEL_NO_ANCESTOR records
+  levelDepth            Int?    @map("level_depth")
   distributionRate      Decimal @map("distribution_rate") @db.Decimal(10, 6)
   amount                Decimal @map("amount") @db.Decimal(30, 10)
@@ -330,12 +339,15 @@ model SystemContributionRecord {
   expireDate DateTime? @map("expire_date") @db.Date
   isExpired  Boolean   @default(false) @map("is_expired")
   createdAt  DateTime  @default(now()) @map("created_at")
+  deletedAt  DateTime? @map("deleted_at") // soft-delete marker
   systemAccount SystemAccount @relation(fields: [systemAccountId], references: [id])
   @@index([systemAccountId])
   @@index([sourceAdoptionId])
+  @@index([deletedAt])
+  @@index([sourceType])
   @@map("system_contribution_records")
 }
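The schema change above moves uniqueness from `accountType` alone to the `(accountType, regionCode)` pair, matching the `${accountType}:${regionCode || 'null'}` aggregate IDs used in the admin controller in this change set. A small sketch of that composite keying, with a Map standing in for the table (the store and helper names are hypothetical, only the field names come from the model):

```typescript
// Sketch: system accounts are unique per (account_type, region_code),
// so two PROVINCE accounts may coexist as long as their regions differ.
type SystemAccount = { accountType: string; regionCode: string | null; balance: string };

// Same 'null' placeholder convention as the outbox aggregateId in this change set.
const compositeKey = (accountType: string, regionCode: string | null): string =>
  `${accountType}:${regionCode ?? 'null'}`;

// Insert-or-replace keyed by the composite unique key, mimicking an upsert
// against the system_accounts_account_type_region_code_key index.
function upsertSystemAccount(store: Map<string, SystemAccount>, acc: SystemAccount): void {
  store.set(compositeKey(acc.accountType, acc.regionCode), acc);
}
```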


@@ -10,6 +10,8 @@ import {
   AdoptionSyncedEvent,
   ContributionRecordSyncedEvent,
   NetworkProgressUpdatedEvent,
+  SystemAccountSyncedEvent,
+  UnallocatedContributionSyncedEvent,
 } from '../../domain/events';
 import { Public } from '../../shared/guards/jwt-auth.guard';
@@ -420,4 +422,190 @@ export class AdminController {
       };
     }
   }
+
+  @Post('system-accounts/publish-all')
+  @Public()
+  @ApiOperation({ summary: 'Publish all system-account contribution events to the outbox, for syncing to mining-service' })
+  async publishAllSystemAccounts(): Promise<{
+    success: boolean;
+    publishedCount: number;
+    message: string;
+  }> {
+    try {
+      const systemAccounts = await this.prisma.systemAccount.findMany();
+
+      await this.unitOfWork.executeInTransaction(async () => {
+        const events = systemAccounts.map((account) => {
+          const event = new SystemAccountSyncedEvent(
+            account.accountType,
+            account.regionCode,
+            account.name,
+            account.contributionBalance.toString(),
+            account.createdAt,
+          );
+          return {
+            aggregateType: SystemAccountSyncedEvent.AGGREGATE_TYPE,
+            aggregateId: `${account.accountType}:${account.regionCode || 'null'}`,
+            eventType: SystemAccountSyncedEvent.EVENT_TYPE,
+            payload: event.toPayload(),
+          };
+        });
+        await this.outboxRepository.saveMany(events);
+      });
+
+      this.logger.log(`Published ${systemAccounts.length} system account events`);
+      return {
+        success: true,
+        publishedCount: systemAccounts.length,
+        message: `Published ${systemAccounts.length} system account events`,
+      };
+    } catch (error) {
+      this.logger.error('Failed to publish system accounts', error);
+      return {
+        success: false,
+        publishedCount: 0,
+        message: `Failed: ${error.message}`,
+      };
+    }
+  }
+
+  @Get('system-accounts')
+  @Public()
+  @ApiOperation({ summary: 'Get contribution balances of all system accounts' })
+  async getSystemAccounts() {
+    const systemAccounts = await this.prisma.systemAccount.findMany();
+    return {
+      accounts: systemAccounts.map((a) => ({
+        accountType: a.accountType,
+        name: a.name,
+        contributionBalance: a.contributionBalance.toString(),
+        createdAt: a.createdAt,
+        updatedAt: a.updatedAt,
+      })),
+      total: systemAccounts.length,
+    };
+  }
+
+  @Get('unallocated-contributions')
+  @Public()
+  @ApiOperation({ summary: 'List all unallocated contributions, for scheduled sync by mining-service' })
+  async getUnallocatedContributions(): Promise<{
+    contributions: Array<{
+      sourceAdoptionId: string;
+      sourceAccountSequence: string;
+      wouldBeAccountSequence: string | null;
+      contributionType: string;
+      amount: string;
+      reason: string | null;
+      effectiveDate: string;
+      expireDate: string;
+    }>;
+    total: number;
+  }> {
+    const unallocatedContributions = await this.prisma.unallocatedContribution.findMany({
+      where: { status: 'PENDING' },
+      select: {
+        sourceAdoptionId: true,
+        sourceAccountSequence: true,
+        wouldBeAccountSequence: true,
+        unallocType: true,
+        amount: true,
+        reason: true,
+        effectiveDate: true,
+        expireDate: true,
+      },
+    });
+    return {
+      contributions: unallocatedContributions.map((uc) => ({
+        sourceAdoptionId: uc.sourceAdoptionId.toString(),
+        sourceAccountSequence: uc.sourceAccountSequence,
+        wouldBeAccountSequence: uc.wouldBeAccountSequence,
+        contributionType: uc.unallocType,
+        amount: uc.amount.toString(),
+        reason: uc.reason,
+        effectiveDate: uc.effectiveDate.toISOString(),
+        expireDate: uc.expireDate.toISOString(),
+      })),
+      total: unallocatedContributions.length,
+    };
+  }
+
+  @Post('unallocated-contributions/publish-all')
+  @Public()
+  @ApiOperation({ summary: 'Publish all unallocated-contribution events to the outbox, for syncing to mining-service' })
+  async publishAllUnallocatedContributions(): Promise<{
+    success: boolean;
+    publishedCount: number;
+    failedCount: number;
+    message: string;
+  }> {
+    const unallocatedContributions = await this.prisma.unallocatedContribution.findMany({
+      where: { status: 'PENDING' },
+      select: {
+        id: true,
+        sourceAdoptionId: true,
+        sourceAccountSequence: true,
+        wouldBeAccountSequence: true,
+        unallocType: true,
+        amount: true,
+        reason: true,
+        effectiveDate: true,
+        expireDate: true,
+      },
+    });
+
+    let publishedCount = 0;
+    let failedCount = 0;
+    const batchSize = 100;
+
+    for (let i = 0; i < unallocatedContributions.length; i += batchSize) {
+      const batch = unallocatedContributions.slice(i, i + batchSize);
+      try {
+        await this.unitOfWork.executeInTransaction(async () => {
+          const events = batch.map((uc) => {
+            const event = new UnallocatedContributionSyncedEvent(
+              uc.sourceAdoptionId,
+              uc.sourceAccountSequence,
+              uc.wouldBeAccountSequence,
+              uc.unallocType,
+              uc.amount.toString(),
+              uc.reason,
+              uc.effectiveDate,
+              uc.expireDate,
+            );
+            return {
+              aggregateType: UnallocatedContributionSyncedEvent.AGGREGATE_TYPE,
+              aggregateId: `${uc.sourceAdoptionId}-${uc.unallocType}`,
+              eventType: UnallocatedContributionSyncedEvent.EVENT_TYPE,
+              payload: event.toPayload(),
+            };
+          });
+          await this.outboxRepository.saveMany(events);
+        });
+        publishedCount += batch.length;
+        this.logger.debug(`Published unallocated contribution batch ${Math.floor(i / batchSize) + 1}: ${batch.length} events`);
+      } catch (error) {
+        failedCount += batch.length;
+        this.logger.error(`Failed to publish unallocated contribution batch ${Math.floor(i / batchSize) + 1}`, error);
+      }
+    }
+
+    this.logger.log(`Published ${publishedCount} unallocated contribution events, ${failedCount} failed`);
+    return {
+      success: failedCount === 0,
+      publishedCount,
+      failedCount,
+      message: `Published ${publishedCount} events, ${failedCount} failed out of ${unallocatedContributions.length} total`,
+    };
+  }
 }
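The publish-all endpoint above writes outbox events in chunks of 100, runs each chunk in its own transaction, and tallies successes and failures per chunk rather than per event. The loop can be sketched on its own; the `publish` callback here is a stand-in for `outboxRepository.saveMany` inside `executeInTransaction`:

```typescript
// Sketch of the batch-publish loop: slice into chunks of `batchSize`,
// publish each chunk atomically, and tally per-chunk success/failure.
function publishInBatches<T>(
  items: T[],
  batchSize: number,
  publish: (batch: T[]) => void, // may throw; a failed chunk fails as a whole
): { publishedCount: number; failedCount: number } {
  let publishedCount = 0;
  let failedCount = 0;
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    try {
      publish(batch);
      publishedCount += batch.length;
    } catch {
      failedCount += batch.length; // count the whole chunk as failed
    }
  }
  return { publishedCount, failedCount };
}
```

Because each chunk commits independently, one failing chunk does not roll back the others; the counts in the response tell the operator exactly how much was published.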


@@ -1,8 +1,10 @@
 import { Controller, Get, Param, Query, NotFoundException } from '@nestjs/common';
-import { ApiTags, ApiOperation, ApiResponse, ApiParam } from '@nestjs/swagger';
+import { ApiTags, ApiOperation, ApiResponse, ApiParam, ApiQuery } from '@nestjs/swagger';
 import { GetContributionAccountQuery } from '../../application/queries/get-contribution-account.query';
 import { GetContributionStatsQuery } from '../../application/queries/get-contribution-stats.query';
 import { GetContributionRankingQuery } from '../../application/queries/get-contribution-ranking.query';
+import { GetPlantingLedgerQuery, PlantingLedgerDto } from '../../application/queries/get-planting-ledger.query';
+import { GetTeamTreeQuery, DirectReferralsResponseDto, MyTeamInfoDto } from '../../application/queries/get-team-tree.query';
 import {
   ContributionAccountResponse,
   ContributionRecordsResponse,
@@ -11,6 +13,7 @@ import {
 import { ContributionStatsResponse } from '../dto/response/contribution-stats.response';
 import { ContributionRankingResponse, UserRankResponse } from '../dto/response/contribution-ranking.response';
 import { GetContributionRecordsRequest } from '../dto/request/get-records.request';
+import { Public } from '../../shared/guards/jwt-auth.guard';
 @ApiTags('Contribution')
 @Controller('contribution')
@@ -19,9 +22,12 @@ export class ContributionController {
     private readonly getAccountQuery: GetContributionAccountQuery,
     private readonly getStatsQuery: GetContributionStatsQuery,
     private readonly getRankingQuery: GetContributionRankingQuery,
+    private readonly getPlantingLedgerQuery: GetPlantingLedgerQuery,
+    private readonly getTeamTreeQuery: GetTeamTreeQuery,
   ) {}
   @Get('stats')
+  @Public()
   @ApiOperation({ summary: 'Get contribution statistics' })
   @ApiResponse({ status: 200, type: ContributionStatsResponse })
   async getStats(): Promise<ContributionStatsResponse> {
@@ -95,4 +101,52 @@ export class ContributionController {
     }
     return result;
   }
+
+  @Get('accounts/:accountSequence/planting-ledger')
+  @ApiOperation({ summary: 'Get the planting ledger of an account' })
+  @ApiParam({ name: 'accountSequence', description: 'Account sequence number' })
+  @ApiQuery({ name: 'page', required: false, type: Number, description: 'Page number' })
+  @ApiQuery({ name: 'pageSize', required: false, type: Number, description: 'Page size' })
+  @ApiResponse({ status: 200, description: 'Planting ledger' })
+  async getPlantingLedger(
+    @Param('accountSequence') accountSequence: string,
+    @Query('page') page?: number,
+    @Query('pageSize') pageSize?: number,
+  ): Promise<PlantingLedgerDto> {
+    return this.getPlantingLedgerQuery.execute(
+      accountSequence,
+      page ?? 1,
+      pageSize ?? 20,
+    );
+  }
+
+  // ========== Team tree API ==========
+
+  @Get('accounts/:accountSequence/team')
+  @ApiOperation({ summary: 'Get the team info of an account' })
+  @ApiParam({ name: 'accountSequence', description: 'Account sequence number' })
+  @ApiResponse({ status: 200, description: 'Team info' })
+  async getMyTeamInfo(
+    @Param('accountSequence') accountSequence: string,
+  ): Promise<MyTeamInfoDto> {
+    return this.getTeamTreeQuery.getMyTeamInfo(accountSequence);
+  }
+
+  @Get('accounts/:accountSequence/team/direct-referrals')
+  @ApiOperation({ summary: 'Get the direct referrals of an account (for lazy-loading the downline tree)' })
+  @ApiParam({ name: 'accountSequence', description: 'Account sequence number' })
+  @ApiQuery({ name: 'limit', required: false, type: Number, description: 'Page size' })
+  @ApiQuery({ name: 'offset', required: false, type: Number, description: 'Offset' })
+  @ApiResponse({ status: 200, description: 'Direct referral list' })
+  async getDirectReferrals(
+    @Param('accountSequence') accountSequence: string,
+    @Query('limit') limit?: number,
+    @Query('offset') offset?: number,
+  ): Promise<DirectReferralsResponseDto> {
+    return this.getTeamTreeQuery.getDirectReferrals(
+      accountSequence,
+      limit ?? 100,
+      offset ?? 0,
+    );
+  }
 }


@@ -2,6 +2,7 @@ import { Controller, Get } from '@nestjs/common';
 import { ApiTags, ApiOperation, ApiResponse } from '@nestjs/swagger';
 import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
 import { RedisService } from '../../infrastructure/redis/redis.service';
+import { CDCConsumerService } from '../../infrastructure/kafka/cdc-consumer.service';
 import { Public } from '../../shared/guards/jwt-auth.guard';
 interface HealthStatus {
@@ -20,6 +21,7 @@ export class HealthController {
   constructor(
     private readonly prisma: PrismaService,
     private readonly redis: RedisService,
+    private readonly cdcConsumer: CDCConsumerService,
   ) {}
   @Get()
@@ -68,4 +70,15 @@ export class HealthController {
   async live(): Promise<{ alive: boolean }> {
     return { alive: true };
   }
+
+  @Get('cdc-sync')
+  @ApiOperation({ summary: 'CDC sync status check' })
+  @ApiResponse({ status: 200, description: 'CDC sync status' })
+  async cdcSyncStatus(): Promise<{
+    isRunning: boolean;
+    sequentialMode: boolean;
+    allPhasesCompleted: boolean;
+  }> {
+    return this.cdcConsumer.getSyncStatus();
+  }
 }


@@ -16,6 +16,7 @@ import { JwtAuthGuard } from './shared/guards/jwt-auth.guard';
       envFilePath: [
         `.env.${process.env.NODE_ENV || 'development'}`,
         '.env',
+        '../.env', // shared .env in the parent directory
       ],
       ignoreEnvFile: false,
     }),


@@ -12,12 +12,15 @@ import { CDCEventDispatcher } from './event-handlers/cdc-event-dispatcher';
 import { ContributionCalculationService } from './services/contribution-calculation.service';
 import { ContributionDistributionPublisherService } from './services/contribution-distribution-publisher.service';
 import { ContributionRateService } from './services/contribution-rate.service';
+import { BonusClaimService } from './services/bonus-claim.service';
 import { SnapshotService } from './services/snapshot.service';
 // Queries
 import { GetContributionAccountQuery } from './queries/get-contribution-account.query';
 import { GetContributionStatsQuery } from './queries/get-contribution-stats.query';
 import { GetContributionRankingQuery } from './queries/get-contribution-ranking.query';
+import { GetPlantingLedgerQuery } from './queries/get-planting-ledger.query';
+import { GetTeamTreeQuery } from './queries/get-team-tree.query';
 // Schedulers
 import { ContributionScheduler } from './schedulers/contribution.scheduler';
@@ -38,12 +41,15 @@ import { ContributionScheduler } from './schedulers/contribution.scheduler';
     ContributionCalculationService,
     ContributionDistributionPublisherService,
     ContributionRateService,
+    BonusClaimService,
     SnapshotService,
     // Queries
     GetContributionAccountQuery,
     GetContributionStatsQuery,
     GetContributionRankingQuery,
+    GetPlantingLedgerQuery,
+    GetTeamTreeQuery,
     // Schedulers
     ContributionScheduler,
@@ -55,6 +61,8 @@ import { ContributionScheduler } from './schedulers/contribution.scheduler';
     GetContributionAccountQuery,
     GetContributionStatsQuery,
     GetContributionRankingQuery,
+    GetPlantingLedgerQuery,
+    GetTeamTreeQuery,
   ],
 })
 export class ApplicationModule {}


@@ -2,6 +2,7 @@ import { Injectable, Logger } from '@nestjs/common';
 import Decimal from 'decimal.js';
 import { CDCEvent, TransactionClient } from '../../infrastructure/kafka/cdc-consumer.service';
 import { ContributionCalculationService } from '../services/contribution-calculation.service';
+import { ContributionRateService } from '../services/contribution-rate.service';
 /**
  *
@@ -15,19 +16,11 @@ export interface AdoptionSyncResult {
  * CDC adoption synced-event handler
  * Syncs the planting_orders data coming from the 1.0 planting-service
  *
  *
  * ===========================================
- * - handle() upserts into synced_adoptions
- * - returns an AdoptionSyncResult carrying the adoption ID
- * - calculateForAdoption runs afterwards
- *
- * Notes on calculateForAdoption:
- * 1. calculateForAdoption is invoked outside the sync transaction
- * 2. it runs at the Serializable isolation level
- * 3. "Adoption not found" means the row is missing from synced_adoptions
- *
- * Kafka Idempotent Consumer & Transactional Outbox Pattern:
- * https://www.lydtechconsulting.com/blog/kafka-idempotent-consumer-transactional-outbox
+ * - handle() syncs 100% of the incoming data
+ * - only status MINING_ENABLED triggers contribution calculation
+ * - runs at the Serializable isolation level
  */
 @Injectable()
 export class AdoptionSyncedHandler {
@@ -35,6 +28,7 @@ export class AdoptionSyncedHandler {
   constructor(
     private readonly contributionCalculationService: ContributionCalculationService,
+    private readonly contributionRateService: ContributionRateService,
   ) {}
/** /**
@@ -48,13 +42,28 @@
     this.logger.log(`[CDC] Adoption event received: op=${op}, seq=${event.sequenceNum}`);
     this.logger.debug(`[CDC] Adoption event payload: ${JSON.stringify(after || before)}`);
+    // Get the adoption date, used to look up that day's contribution value
+    const data = after || before;
+    const adoptionDate = data?.created_at || data?.createdAt || data?.paid_at || data?.paidAt;
+
+    // Fetch that day's per-tree contribution value outside the transaction
+    let contributionPerTree = new Decimal('22617'); // default value
+    if (adoptionDate) {
+      try {
+        contributionPerTree = await this.contributionRateService.getContributionPerTree(new Date(adoptionDate));
+        this.logger.log(`[CDC] Got contributionPerTree for ${adoptionDate}: ${contributionPerTree.toString()}`);
+      } catch (error) {
+        this.logger.warn(`[CDC] Failed to get contributionPerTree, using default 22617`, error);
+      }
+    }
+
     try {
       switch (op) {
         case 'c': // create
         case 'r': // read (snapshot)
-          return await this.handleCreate(after, event.sequenceNum, tx);
+          return await this.handleCreate(after, event.sequenceNum, tx, contributionPerTree);
         case 'u': // update
-          return await this.handleUpdate(after, before, event.sequenceNum, tx);
+          return await this.handleUpdate(after, before, event.sequenceNum, tx, contributionPerTree);
         case 'd': // delete
           await this.handleDelete(before);
           return null;
@@ -86,21 +95,21 @@
     }
   }
-  private async handleCreate(data: any, sequenceNum: bigint, tx: TransactionClient): Promise<AdoptionSyncResult | null> {
+  private async handleCreate(data: any, sequenceNum: bigint, tx: TransactionClient, contributionPerTree: Decimal): Promise<AdoptionSyncResult | null> {
     if (!data) {
       this.logger.warn(`[CDC] Adoption create: empty data received`);
       return null;
     }
     // planting_orders table fields: order_id, account_sequence, tree_count, created_at, status, selected_province, selected_city
     const orderId = data.order_id || data.id;
     const accountSequence = data.account_sequence || data.accountSequence;
     const treeCount = data.tree_count || data.treeCount;
     const createdAt = data.created_at || data.createdAt || data.paid_at || data.paidAt;
     const selectedProvince = data.selected_province || data.selectedProvince || null;
     const selectedCity = data.selected_city || data.selectedCity || null;
+    const status = data.status ?? null;
-    this.logger.log(`[CDC] Adoption create: orderId=${orderId}, account=${accountSequence}, trees=${treeCount}, province=${selectedProvince}, city=${selectedCity}`);
+    this.logger.log(`[CDC] Adoption create: orderId=${orderId}, account=${accountSequence}, trees=${treeCount}, status=${status}, contributionPerTree=${contributionPerTree.toString()}`);
     if (!orderId || !accountSequence) {
       this.logger.warn(`[CDC] Invalid adoption data: missing order_id or account_sequence`, { data });
@@ -109,8 +118,7 @@
     const originalAdoptionId = BigInt(orderId);
-    // Save the synced adoption order inside the transaction
-    this.logger.log(`[CDC] Upserting synced adoption: ${orderId}`);
+    // 100% data sync, using the real per-tree contribution value
     await tx.syncedAdoption.upsert({
       where: { originalAdoptionId },
       create: {
@@ -118,10 +126,10 @@
         accountSequence,
         treeCount,
         adoptionDate: new Date(createdAt),
-        status: data.status ?? null,
+        status,
         selectedProvince,
         selectedCity,
-        contributionPerTree: new Decimal('1'), // 1 contribution per tree
+        contributionPerTree,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
@@ -129,25 +137,26 @@
         accountSequence,
         treeCount,
         adoptionDate: new Date(createdAt),
-        status: data.status ?? undefined,
-        selectedProvince: selectedProvince ?? undefined,
-        selectedCity: selectedCity ?? undefined,
-        contributionPerTree: new Decimal('1'),
+        status,
+        selectedProvince,
+        selectedCity,
+        contributionPerTree,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
     });
-    this.logger.log(`[CDC] Adoption synced successfully: orderId=${orderId}, account=${accountSequence}, trees=${treeCount}`);
+    this.logger.log(`[CDC] Adoption synced: orderId=${orderId}, status=${status}`);
-    // Return the result so contribution is calculated after the transaction commits
+    // Only the MINING_ENABLED status triggers contribution calculation
+    const needsCalculation = status === 'MINING_ENABLED';
     return {
       originalAdoptionId,
-      needsCalculation: true,
+      needsCalculation,
     };
   }
private async handleUpdate(after: any, before: any, sequenceNum: bigint, tx: TransactionClient): Promise<AdoptionSyncResult | null> { private async handleUpdate(after: any, before: any, sequenceNum: bigint, tx: TransactionClient, contributionPerTree: Decimal): Promise<AdoptionSyncResult | null> {
if (!after) { if (!after) {
this.logger.warn(`[CDC] Adoption update: empty after data received`); this.logger.warn(`[CDC] Adoption update: empty after data received`);
return null; return null;
@@ -155,37 +164,22 @@ export class AdoptionSyncedHandler {
     const orderId = after.order_id || after.id;
     const originalAdoptionId = BigInt(orderId);
-    this.logger.log(`[CDC] Adoption update: orderId=${orderId}`);
-    // Check whether this adoption was already processed (using the transaction client)
-    const existingAdoption = await tx.syncedAdoption.findUnique({
-      where: { originalAdoptionId },
-    });
-    if (existingAdoption?.contributionDistributed) {
-      // If the tree count changed, a recalculation is needed (rare case)
-      const newTreeCount = after.tree_count || after.treeCount;
-      if (existingAdoption.treeCount !== newTreeCount) {
-        this.logger.warn(
-          `[CDC] Adoption tree count changed after processing: ${originalAdoptionId}, old=${existingAdoption.treeCount}, new=${newTreeCount}. This requires special handling.`,
-        );
-        // TODO: implement handling for tree-count changes
-      } else {
-        this.logger.debug(`[CDC] Adoption ${orderId} already distributed, skipping update`);
-      }
-      return null;
-    }
     const accountSequence = after.account_sequence || after.accountSequence;
     const treeCount = after.tree_count || after.treeCount;
     const createdAt = after.created_at || after.createdAt || after.paid_at || after.paidAt;
     const selectedProvince = after.selected_province || after.selectedProvince || null;
     const selectedCity = after.selected_city || after.selectedCity || null;
+    const newStatus = after.status ?? null;
+    const oldStatus = before?.status ?? null;
-    this.logger.log(`[CDC] Adoption update data: account=${accountSequence}, trees=${treeCount}, province=${selectedProvince}, city=${selectedCity}`);
+    this.logger.log(`[CDC] Adoption update: orderId=${orderId}, status=${oldStatus} -> ${newStatus}, contributionPerTree=${contributionPerTree.toString()}`);
-    // Persist the synced adoption order inside the transaction
+    // Look up the existing record
+    const existingAdoption = await tx.syncedAdoption.findUnique({
+      where: { originalAdoptionId },
+    });
+    // Mirror the data 1:1, using the real per-tree contribution value
     await tx.syncedAdoption.upsert({
       where: { originalAdoptionId },
       create: {
@@ -193,10 +187,10 @@ export class AdoptionSyncedHandler {
         accountSequence,
         treeCount,
         adoptionDate: new Date(createdAt),
-        status: after.status ?? null,
+        status: newStatus,
         selectedProvince,
         selectedCity,
-        contributionPerTree: new Decimal('1'),
+        contributionPerTree,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
@@ -204,21 +198,24 @@ export class AdoptionSyncedHandler {
         accountSequence,
         treeCount,
         adoptionDate: new Date(createdAt),
-        status: after.status ?? undefined,
-        selectedProvince: selectedProvince ?? undefined,
-        selectedCity: selectedCity ?? undefined,
-        contributionPerTree: new Decimal('1'),
+        status: newStatus,
+        selectedProvince,
+        selectedCity,
+        contributionPerTree,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
     });
-    this.logger.log(`[CDC] Adoption updated successfully: ${originalAdoptionId}`);
+    this.logger.log(`[CDC] Adoption synced: orderId=${orderId}, status=${newStatus}`);
+    // Trigger hashrate calculation only when status transitions to MINING_ENABLED
+    // and hashrate has not been calculated yet
+    const statusChangedToMiningEnabled = newStatus === 'MINING_ENABLED' && oldStatus !== 'MINING_ENABLED';
+    const needsCalculation = statusChangedToMiningEnabled && !existingAdoption?.contributionDistributed;
-    // Only adoptions whose contribution has not yet been distributed need calculation
     return {
       originalAdoptionId,
-      needsCalculation: !existingAdoption?.contributionDistributed,
+      needsCalculation,
     };
   }


@@ -51,14 +51,17 @@ export class CDCEventDispatcher implements OnModuleInit {
       this.handleAdoptionPostCommit.bind(this),
     );
-    // Start the CDC consumer
-    try {
-      await this.cdcConsumer.start();
-      this.logger.log('CDC event dispatcher started with transactional idempotency');
-    } catch (error) {
-      this.logger.error('Failed to start CDC event dispatcher', error);
-      // Do not rethrow: allow the service to start without Kafka (for local development)
-    }
+    // Start the CDC consumer without blocking, so the HTTP server comes up first
+    // while CDC sync proceeds in the background; scripts poll the
+    // /health/cdc-sync API for sync status
+    this.cdcConsumer.start()
+      .then(() => {
+        this.logger.log('CDC event dispatcher started with transactional idempotency');
+      })
+      .catch((error) => {
+        this.logger.error('Failed to start CDC event dispatcher', error);
+        // Do not rethrow: allow the service to start without Kafka (for local development)
+      });
   }
   private async handleUserEvent(event: CDCEvent, tx: TransactionClient): Promise<void> {
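Because the consumer now starts in the background, a deployment script cannot assume sync has finished when the process is up; it has to poll the `/health/cdc-sync` endpoint mentioned above. A minimal polling sketch — the endpoint name comes from the diff, but its response shape (`{ synced: boolean }`) and both function names are assumptions for illustration:

```typescript
// Pure predicate kept separate so the readiness check can be tested
// without a running server. The { synced } shape is an assumed response body.
function isCdcSynced(body: { synced?: boolean } | null | undefined): boolean {
  return body?.synced === true;
}

// Poll the health endpoint until sync reports completion or the deadline passes.
async function waitForCdcSync(
  baseUrl: string,
  timeoutMs = 60_000,
  intervalMs = 2_000,
): Promise<boolean> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(`${baseUrl}/health/cdc-sync`);
      if (res.ok && isCdcSynced(await res.json())) {
        return true;
      }
    } catch {
      // Service may still be booting; keep polling.
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false;
}
```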


@@ -5,22 +5,7 @@ import { CDCEvent, TransactionClient } from '../../infrastructure/kafka/cdc-cons
 /**
  * CDC handler
  * Syncs referral_relationships data pushed from the 1.0 referral-service
  *
- * 1.0 fields (referral_relationships):
- * - user_id: BigInt (user ID)
- * - account_sequence: String (account sequence)
- * - referrer_id: BigInt (referrer's user ID, resolved to an account_sequence)
- * - ancestor_path: BigInt[] (chain of ancestor user_id values)
- * - depth: Int (referral depth)
- *
- * 2.0 mapping:
- * - original_user_id (from 1.0 user_id)
- * - referrer_user_id (from 1.0 referrer_id)
- * - referrer account_sequence (resolved)
- * - ancestor_path
- *
- * All operations run on the tx passed into the handler
- * (so they share the dispatcher's transaction)
- *
+ * Mirrors the data 1:1
  */
 @Injectable()
 export class ReferralSyncedHandler {
@@ -61,12 +46,11 @@ export class ReferralSyncedHandler {
       return;
     }
-    // 1.0 field mapping
     const accountSequence = data.account_sequence || data.accountSequence;
     const originalUserId = data.user_id || data.userId;
     const referrerUserId = data.referrer_id || data.referrerId;
     const ancestorPathArray = data.ancestor_path || data.ancestorPath;
-    const depth = data.depth || 0;
+    const depth = data.depth ?? 0;
     this.logger.log(`[CDC] Referral create: account=${accountSequence}, userId=${originalUserId}, referrerId=${referrerUserId}, depth=${depth}`);
@@ -75,11 +59,9 @@ export class ReferralSyncedHandler {
       return;
     }
-    // Convert the BigInt[] into a comma-separated string
     const ancestorPath = this.convertAncestorPath(ancestorPathArray);
-    this.logger.debug(`[CDC] Referral ancestorPath converted: ${ancestorPath}`);
-    // Try to resolve the referrer's account_sequence (using the transaction client)
+    // Try to resolve the referrer's account_sequence
     let referrerAccountSequence: string | null = null;
     if (referrerUserId) {
       const referrer = await tx.syncedReferral.findFirst({
@@ -87,14 +69,10 @@ export class ReferralSyncedHandler {
       });
       if (referrer) {
         referrerAccountSequence = referrer.accountSequence;
-        this.logger.debug(`[CDC] Found referrer account_sequence: ${referrerAccountSequence} for referrer_id: ${referrerUserId}`);
-      } else {
-        this.logger.log(`[CDC] Referrer user_id ${referrerUserId} not found yet for ${accountSequence}, will resolve later`);
       }
     }
-    // Run every operation on the external transaction client
-    this.logger.log(`[CDC] Upserting synced referral: ${accountSequence}`);
+    // Mirror the data 1:1
     await tx.syncedReferral.upsert({
       where: { accountSequence },
       create: {
@@ -108,17 +86,17 @@ export class ReferralSyncedHandler {
         syncedAt: new Date(),
       },
       update: {
-        referrerAccountSequence: referrerAccountSequence ?? undefined,
-        referrerUserId: referrerUserId ? BigInt(referrerUserId) : undefined,
-        originalUserId: originalUserId ? BigInt(originalUserId) : undefined,
-        ancestorPath: ancestorPath ?? undefined,
-        depth: depth ?? undefined,
+        referrerAccountSequence,
+        referrerUserId: referrerUserId ? BigInt(referrerUserId) : null,
+        originalUserId: originalUserId ? BigInt(originalUserId) : null,
+        ancestorPath,
+        depth,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
     });
-    this.logger.log(`[CDC] Referral synced successfully: ${accountSequence} (user_id: ${originalUserId}) -> referrer_id: ${referrerUserId || 'none'}, depth: ${depth}`);
+    this.logger.log(`[CDC] Referral synced: ${accountSequence}, referrerId=${referrerUserId || 'none'}, depth=${depth}`);
   }
   private async handleUpdate(data: any, sequenceNum: bigint, tx: TransactionClient): Promise<void> {
@@ -131,7 +109,7 @@ export class ReferralSyncedHandler {
     const originalUserId = data.user_id || data.userId;
     const referrerUserId = data.referrer_id || data.referrerId;
     const ancestorPathArray = data.ancestor_path || data.ancestorPath;
-    const depth = data.depth || 0;
+    const depth = data.depth ?? 0;
     this.logger.log(`[CDC] Referral update: account=${accountSequence}, referrerId=${referrerUserId}, depth=${depth}`);
@@ -142,7 +120,7 @@ export class ReferralSyncedHandler {
     const ancestorPath = this.convertAncestorPath(ancestorPathArray);
-    // Try to resolve the referrer's account_sequence (using the transaction client)
+    // Try to resolve the referrer's account_sequence
     let referrerAccountSequence: string | null = null;
     if (referrerUserId) {
       const referrer = await tx.syncedReferral.findFirst({
@@ -150,10 +128,10 @@ export class ReferralSyncedHandler {
       });
       if (referrer) {
         referrerAccountSequence = referrer.accountSequence;
-        this.logger.debug(`[CDC] Found referrer account_sequence: ${referrerAccountSequence}`);
       }
     }
+    // Mirror the data 1:1
     await tx.syncedReferral.upsert({
       where: { accountSequence },
       create: {
@@ -167,17 +145,17 @@ export class ReferralSyncedHandler {
         syncedAt: new Date(),
       },
       update: {
-        referrerAccountSequence: referrerAccountSequence ?? undefined,
-        referrerUserId: referrerUserId ? BigInt(referrerUserId) : undefined,
-        originalUserId: originalUserId ? BigInt(originalUserId) : undefined,
-        ancestorPath: ancestorPath ?? undefined,
-        depth: depth ?? undefined,
+        referrerAccountSequence,
+        referrerUserId: referrerUserId ? BigInt(referrerUserId) : null,
+        originalUserId: originalUserId ? BigInt(originalUserId) : null,
+        ancestorPath,
+        depth,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
     });
-    this.logger.log(`[CDC] Referral updated successfully: ${accountSequence}`);
+    this.logger.log(`[CDC] Referral synced: ${accountSequence}`);
   }
   private async handleDelete(data: any): Promise<void> {


@@ -6,9 +6,7 @@ import { ContributionAccountAggregate } from '../../domain/aggregates/contributi
 /**
  * CDC user sync handler
  *
- * All operations run on the tx passed into the handler
- * (so they share the dispatcher's transaction)
+ * Mirrors the data 1:1
  */
 @Injectable()
 export class UserSyncedHandler {
@@ -49,22 +47,19 @@ export class UserSyncedHandler {
       return;
     }
-    // Tolerate both field naming styles (CDC uses snake_case)
     const userId = data.user_id ?? data.id;
     const accountSequence = data.account_sequence ?? data.accountSequence;
     const phone = data.phone_number ?? data.phone ?? null;
-    const status = data.status ?? 'ACTIVE';
+    const status = data.status ?? null;
-    this.logger.log(`[CDC] User create: userId=${userId}, accountSequence=${accountSequence}, phone=${phone}, status=${status}`);
+    this.logger.log(`[CDC] User create: userId=${userId}, accountSequence=${accountSequence}, status=${status}`);
     if (!userId || !accountSequence) {
       this.logger.warn(`[CDC] Invalid user data: missing user_id or account_sequence`, { data });
       return;
     }
-    // Run every operation on the external transaction client
-    // Persist the synced user data
-    this.logger.log(`[CDC] Upserting synced user: ${accountSequence}`);
+    // Mirror the data 1:1
     await tx.syncedUser.upsert({
       where: { accountSequence },
       create: {
@@ -76,8 +71,9 @@ export class UserSyncedHandler {
         syncedAt: new Date(),
       },
       update: {
-        phone: phone ?? undefined,
-        status: status ?? undefined,
+        originalUserId: BigInt(userId),
+        phone,
+        status,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
@@ -95,11 +91,9 @@ export class UserSyncedHandler {
         data: persistData,
       });
       this.logger.log(`[CDC] Created contribution account for user: ${accountSequence}`);
-    } else {
-      this.logger.debug(`[CDC] Contribution account already exists for user: ${accountSequence}`);
     }
-    this.logger.log(`[CDC] User synced successfully: ${accountSequence}`);
+    this.logger.log(`[CDC] User synced: ${accountSequence}`);
   }
   private async handleUpdate(data: any, sequenceNum: bigint, tx: TransactionClient): Promise<void> {
@@ -108,11 +102,10 @@ export class UserSyncedHandler {
       return;
     }
-    // Tolerate both field naming styles (CDC uses snake_case)
     const userId = data.user_id ?? data.id;
     const accountSequence = data.account_sequence ?? data.accountSequence;
     const phone = data.phone_number ?? data.phone ?? null;
-    const status = data.status ?? 'ACTIVE';
+    const status = data.status ?? null;
     this.logger.log(`[CDC] User update: userId=${userId}, accountSequence=${accountSequence}, status=${status}`);
@@ -121,6 +114,7 @@ export class UserSyncedHandler {
       return;
     }
+    // Mirror the data 1:1
     await tx.syncedUser.upsert({
       where: { accountSequence },
       create: {
@@ -132,14 +126,15 @@ export class UserSyncedHandler {
         syncedAt: new Date(),
       },
       update: {
-        phone: phone ?? undefined,
-        status: status ?? undefined,
+        originalUserId: BigInt(userId),
+        phone,
+        status,
         sourceSequenceNum: sequenceNum,
         syncedAt: new Date(),
       },
     });
-    this.logger.log(`[CDC] User updated successfully: ${accountSequence}`);
+    this.logger.log(`[CDC] User synced: ${accountSequence}`);
   }
   private async handleDelete(data: any): Promise<void> {


@@ -183,16 +183,16 @@ export class GetContributionAccountQuery {
   private toRecordDto(record: any): ContributionRecordDto {
     return {
-      id: record.id,
+      id: record.id?.toString() ?? '',
       sourceType: record.sourceType,
-      sourceAdoptionId: record.sourceAdoptionId,
+      sourceAdoptionId: record.sourceAdoptionId?.toString() ?? '',
       sourceAccountSequence: record.sourceAccountSequence,
       treeCount: record.treeCount,
-      baseContribution: record.baseContribution.value.toString(),
-      distributionRate: record.distributionRate.value.toString(),
+      baseContribution: record.baseContribution?.value?.toString() ?? '0',
+      distributionRate: record.distributionRate?.value?.toString() ?? '0',
       levelDepth: record.levelDepth,
       bonusTier: record.bonusTier,
-      finalContribution: record.finalContribution.value.toString(),
+      finalContribution: record.amount?.value?.toString() ?? '0',
       effectiveDate: record.effectiveDate,
       expireDate: record.expireDate,
       isExpired: record.isExpired,


@@ -1,4 +1,5 @@
 import { Injectable } from '@nestjs/common';
+import Decimal from 'decimal.js';
 import { ContributionAccountRepository } from '../../infrastructure/persistence/repositories/contribution-account.repository';
 import { ContributionRecordRepository } from '../../infrastructure/persistence/repositories/contribution-record.repository';
 import { UnallocatedContributionRepository } from '../../infrastructure/persistence/repositories/unallocated-contribution.repository';
@@ -6,6 +7,15 @@ import { SystemAccountRepository } from '../../infrastructure/persistence/reposi
 import { SyncedDataRepository } from '../../infrastructure/persistence/repositories/synced-data.repository';
 import { ContributionSourceType } from '../../domain/aggregates/contribution-account.aggregate';
+// Base hashrate constants
+const BASE_CONTRIBUTION_PER_TREE = new Decimal('22617');
+const RATE_PERSONAL = new Decimal('0.70');
+const RATE_OPERATION = new Decimal('0.12');
+const RATE_PROVINCE = new Decimal('0.01');
+const RATE_CITY = new Decimal('0.02');
+const RATE_LEVEL_TOTAL = new Decimal('0.075');
+const RATE_BONUS_TOTAL = new Decimal('0.075');
 export interface ContributionStatsDto {
   // User statistics
   totalUsers: number;
@@ -16,17 +26,57 @@ export interface ContributionStatsDto {
   totalAdoptions: number;
   processedAdoptions: number;
   unprocessedAdoptions: number;
+  totalTrees: number;
   // Hashrate statistics
   totalContribution: string;
-  // Hashrate distribution
+  // Hashrate distribution (base)
   contributionByType: {
     personal: string;
     teamLevel: string;
     teamBonus: string;
   };
+  // ========== Detailed hashrate breakdown (per business requirement) ==========
+  // Network-wide hashrate = total adopted trees * 22617
+  networkTotalContribution: string;
+  // Personal-user total hashrate = total adopted trees * (22617 * 70%)
+  personalTotalContribution: string;
+  // Operations-account total hashrate = total adopted trees * (22617 * 12%)
+  operationTotalContribution: string;
+  // Province-company total hashrate = total adopted trees * (22617 * 1%)
+  provinceTotalContribution: string;
+  // City-company total hashrate = total adopted trees * (22617 * 2%)
+  cityTotalContribution: string;
+  // Level hashrate details (7.5%)
+  levelContribution: {
+    total: string;
+    unlocked: string;
+    pending: string;
+    byTier: {
+      // Tier 1: levels 1-5
+      tier1: { unlocked: string; pending: string };
+      // Tier 2: levels 6-10
+      tier2: { unlocked: string; pending: string };
+      // Tier 3: levels 11-15
+      tier3: { unlocked: string; pending: string };
+    };
+  };
+  // Team bonus hashrate details (7.5%)
+  bonusContribution: {
+    total: string;
+    unlocked: string;
+    pending: string;
+    byTier: {
+      tier1: { unlocked: string; pending: string };
+      tier2: { unlocked: string; pending: string };
+      tier3: { unlocked: string; pending: string };
+    };
+  };
   // System accounts
   systemAccounts: {
     accountType: string;
@@ -61,6 +111,10 @@ export class GetContributionStatsQuery {
       systemAccounts,
       totalUnallocated,
       unallocatedByType,
+      detailedStats,
+      unallocatedByLevelTier,
+      unallocatedByBonusTier,
+      totalTrees,
     ] = await Promise.all([
       this.syncedDataRepository.countUsers(),
       this.accountRepository.countAccounts(),
@@ -72,8 +126,33 @@ export class GetContributionStatsQuery {
       this.systemAccountRepository.findAll(),
       this.unallocatedRepository.getTotalUnallocated(),
       this.unallocatedRepository.getTotalUnallocatedByType(),
+      this.accountRepository.getDetailedContributionStats(),
+      this.unallocatedRepository.getUnallocatedByLevelTier(),
+      this.unallocatedRepository.getUnallocatedByBonusTier(),
+      this.syncedDataRepository.getTotalTrees(),
     ]);
+    // Theoretical hashrate (total adopted trees * base hashrate per tree)
+    const networkTotal = BASE_CONTRIBUTION_PER_TREE.mul(totalTrees);
+    const personalTotal = networkTotal.mul(RATE_PERSONAL);
+    const operationTotal = networkTotal.mul(RATE_OPERATION);
+    const provinceTotal = networkTotal.mul(RATE_PROVINCE);
+    const cityTotal = networkTotal.mul(RATE_CITY);
+    const levelTotal = networkTotal.mul(RATE_LEVEL_TOTAL);
+    const bonusTotal = networkTotal.mul(RATE_BONUS_TOTAL);
+    // Level hashrate: unlocked + pending
+    const levelUnlocked = new Decimal(detailedStats.levelUnlocked);
+    const levelPending = new Decimal(unallocatedByLevelTier.tier1)
+      .plus(unallocatedByLevelTier.tier2)
+      .plus(unallocatedByLevelTier.tier3);
+    // Team bonus hashrate: unlocked + pending
+    const bonusUnlocked = new Decimal(detailedStats.bonusUnlocked);
+    const bonusPending = new Decimal(unallocatedByBonusTier.tier1)
+      .plus(unallocatedByBonusTier.tier2)
+      .plus(unallocatedByBonusTier.tier3);
     return {
       totalUsers,
       totalAccounts,
@@ -81,12 +160,63 @@ export class GetContributionStatsQuery {
       totalAdoptions,
       processedAdoptions: totalAdoptions - undistributedAdoptions,
       unprocessedAdoptions: undistributedAdoptions,
+      totalTrees,
       totalContribution: totalContribution.value.toString(),
       contributionByType: {
         personal: (contributionByType.get(ContributionSourceType.PERSONAL)?.value || 0).toString(),
         teamLevel: (contributionByType.get(ContributionSourceType.TEAM_LEVEL)?.value || 0).toString(),
         teamBonus: (contributionByType.get(ContributionSourceType.TEAM_BONUS)?.value || 0).toString(),
       },
+      // Detailed hashrate breakdown
+      networkTotalContribution: networkTotal.toString(),
+      personalTotalContribution: personalTotal.toString(),
+      operationTotalContribution: operationTotal.toString(),
+      provinceTotalContribution: provinceTotal.toString(),
+      cityTotalContribution: cityTotal.toString(),
+      // Level hashrate details
+      levelContribution: {
+        total: levelTotal.toString(),
+        unlocked: levelUnlocked.toString(),
+        pending: levelPending.toString(),
+        byTier: {
+          tier1: {
+            unlocked: detailedStats.levelByTier.tier1.unlocked,
+            pending: unallocatedByLevelTier.tier1,
+          },
+          tier2: {
+            unlocked: detailedStats.levelByTier.tier2.unlocked,
+            pending: unallocatedByLevelTier.tier2,
+          },
+          tier3: {
+            unlocked: detailedStats.levelByTier.tier3.unlocked,
+            pending: unallocatedByLevelTier.tier3,
+          },
+        },
+      },
+      // Team bonus hashrate details
+      bonusContribution: {
+        total: bonusTotal.toString(),
+        unlocked: bonusUnlocked.toString(),
+        pending: bonusPending.toString(),
+        byTier: {
+          tier1: {
+            unlocked: detailedStats.bonusByTier.tier1.unlocked,
+            pending: unallocatedByBonusTier.tier1,
+          },
+          tier2: {
+            unlocked: detailedStats.bonusByTier.tier2.unlocked,
+            pending: unallocatedByBonusTier.tier2,
+          },
+          tier3: {
+            unlocked: detailedStats.bonusByTier.tier3.unlocked,
+            pending: unallocatedByBonusTier.tier3,
+          },
+        },
+      },
       systemAccounts: systemAccounts.map((a) => ({
         accountType: a.accountType,
         name: a.name,
@@ -98,4 +228,5 @@ export class GetContributionStatsQuery {
       ),
     };
   }
 }
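The rate constants introduced above are meant to partition the per-tree base of 22617 exactly: 70% + 12% + 1% + 2% + 7.5% + 7.5% = 100%. A quick consistency check, as a sketch using plain integer basis points instead of decimal.js (the names here are my own, not from the service):

```typescript
// Distribution rates in basis points (1 bp = 0.01%), mirroring the Decimal
// constants above; integer arithmetic avoids floating-point drift.
const RATES_BPS: Record<string, number> = {
  personal: 7000,   // 70%  -> personal users
  operation: 1200,  // 12%  -> operations account
  province: 100,    // 1%   -> province company
  city: 200,        // 2%   -> city company
  levelTotal: 750,  // 7.5% -> level rewards
  bonusTotal: 750,  // 7.5% -> team bonus
};

const BASE_PER_TREE = 22617;

// The six buckets must partition the whole base: 10000 bps = 100%.
const totalBps = Object.values(RATES_BPS).reduce((sum, bps) => sum + bps, 0);

// Per-tree share of one bucket, e.g. the 70% personal share.
const personalPerTree = (BASE_PER_TREE * RATES_BPS.personal) / 10000;
```

If the rates are ever changed, the `totalBps === 10000` invariant is the one to re-check, since the stats query derives every theoretical total from it.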


@@ -0,0 +1,85 @@
import { Injectable } from '@nestjs/common';
import { SyncedDataRepository } from '../../infrastructure/persistence/repositories/synced-data.repository';
import { ContributionAccountRepository } from '../../infrastructure/persistence/repositories/contribution-account.repository';
export interface PlantingRecordDto {
orderId: string;
orderNo: string;
originalAdoptionId: string;
treeCount: number;
contributionPerTree: string;
totalContribution: string;
status: string;
adoptionDate: string | null;
createdAt: string;
}
export interface PlantingSummaryDto {
totalOrders: number;
totalTreeCount: number;
totalAmount: string;
effectiveTreeCount: number;
  /** The user's actual effective contribution (personal hashrate) */
effectiveContribution: string;
firstPlantingAt: string | null;
lastPlantingAt: string | null;
}
export interface PlantingLedgerDto {
summary: PlantingSummaryDto;
items: PlantingRecordDto[];
total: number;
page: number;
pageSize: number;
totalPages: number;
}
@Injectable()
export class GetPlantingLedgerQuery {
constructor(
private readonly syncedDataRepository: SyncedDataRepository,
private readonly contributionAccountRepository: ContributionAccountRepository,
) {}
async execute(
accountSequence: string,
page: number = 1,
pageSize: number = 20,
): Promise<PlantingLedgerDto> {
const [summary, ledger, contributionAccount] = await Promise.all([
this.syncedDataRepository.getPlantingSummary(accountSequence),
this.syncedDataRepository.getPlantingLedger(accountSequence, page, pageSize),
this.contributionAccountRepository.findByAccountSequence(accountSequence),
]);
    // Fetch the user's actual effective contribution (personal hashrate)
const effectiveContribution = contributionAccount?.personalContribution.toString() || '0';
return {
summary: {
totalOrders: summary.totalOrders,
totalTreeCount: summary.totalTreeCount,
totalAmount: summary.totalAmount,
effectiveTreeCount: summary.effectiveTreeCount,
effectiveContribution,
firstPlantingAt: summary.firstPlantingAt?.toISOString() || null,
lastPlantingAt: summary.lastPlantingAt?.toISOString() || null,
},
items: ledger.items.map((item) => ({
orderId: item.id.toString(),
orderNo: `ORD-${item.originalAdoptionId}`,
originalAdoptionId: item.originalAdoptionId.toString(),
treeCount: item.treeCount,
contributionPerTree: item.contributionPerTree.toString(),
totalContribution: item.contributionPerTree.mul(item.treeCount).toString(),
status: item.status || 'UNKNOWN',
adoptionDate: item.adoptionDate?.toISOString() || null,
createdAt: item.createdAt.toISOString(),
})),
total: ledger.total,
page: ledger.page,
pageSize: ledger.pageSize,
totalPages: ledger.totalPages,
};
}
}


@@ -0,0 +1,121 @@
import { Injectable, Inject } from '@nestjs/common';
import {
ISyncedDataRepository,
SYNCED_DATA_REPOSITORY,
} from '../../domain/repositories/synced-data.repository.interface';
/**
 * Team member info
 */
export interface TeamMemberDto {
accountSequence: string;
personalPlantingCount: number;
teamPlantingCount: number;
directReferralCount: number;
}
/**
 * Direct referrals response
 */
export interface DirectReferralsResponseDto {
referrals: TeamMemberDto[];
total: number;
hasMore: boolean;
}
/**
 * My team info
 */
export interface MyTeamInfoDto {
accountSequence: string;
personalPlantingCount: number;
teamPlantingCount: number;
directReferralCount: number;
}
@Injectable()
export class GetTeamTreeQuery {
constructor(
@Inject(SYNCED_DATA_REPOSITORY)
private readonly syncedDataRepository: ISyncedDataRepository,
) {}
  /**
   * Get the current user's team summary
   */
async getMyTeamInfo(accountSequence: string): Promise<MyTeamInfoDto> {
    // Personal adopted tree count
    const personalPlantingCount = await this.syncedDataRepository.getTotalTreesByAccountSequence(accountSequence);
    // Direct referral count
    const directReferrals = await this.syncedDataRepository.findDirectReferrals(accountSequence);
    // Team adopted tree count (sum across all downline levels)
    const teamTreesByLevel = await this.syncedDataRepository.getTeamTreesByLevel(accountSequence, 15);
    let teamPlantingCount = 0;
    teamTreesByLevel.forEach((count) => {
      teamPlantingCount += count;
    });
return {
accountSequence,
personalPlantingCount,
teamPlantingCount,
directReferralCount: directReferrals.length,
};
}
  /**
   * Get direct referrals (paginated)
   */
async getDirectReferrals(
accountSequence: string,
limit: number = 100,
offset: number = 0,
): Promise<DirectReferralsResponseDto> {
    // Fetch all direct referrals
    const allDirectReferrals = await this.syncedDataRepository.findDirectReferrals(accountSequence);
    // Paginate
    const total = allDirectReferrals.length;
    const paginatedReferrals = allDirectReferrals.slice(offset, offset + limit);
    // Fetch detailed info for each direct referral
const referrals: TeamMemberDto[] = await Promise.all(
paginatedReferrals.map(async (ref) => {
return this.getTeamMemberInfo(ref.accountSequence);
}),
);
return {
referrals,
total,
hasMore: offset + limit < total,
};
}
  /**
   * Build a single team member's summary
   */
private async getTeamMemberInfo(accountSequence: string): Promise<TeamMemberDto> {
    // Personal adopted tree count
    const personalPlantingCount = await this.syncedDataRepository.getTotalTreesByAccountSequence(accountSequence);
    // Direct referral count
    const directReferrals = await this.syncedDataRepository.findDirectReferrals(accountSequence);
    // Team adopted tree count
    const teamTreesByLevel = await this.syncedDataRepository.getTeamTreesByLevel(accountSequence, 15);
    let teamPlantingCount = 0;
    teamTreesByLevel.forEach((count) => {
      teamPlantingCount += count;
    });
return {
accountSequence,
personalPlantingCount,
teamPlantingCount,
directReferralCount: directReferrals.length,
};
}
}


@@ -3,9 +3,11 @@ import { Cron, CronExpression } from '@nestjs/schedule';
 import { ContributionCalculationService } from '../services/contribution-calculation.service';
 import { SnapshotService } from '../services/snapshot.service';
 import { ContributionRecordRepository } from '../../infrastructure/persistence/repositories/contribution-record.repository';
+import { ContributionAccountRepository } from '../../infrastructure/persistence/repositories/contribution-account.repository';
 import { OutboxRepository } from '../../infrastructure/persistence/repositories/outbox.repository';
 import { KafkaProducerService } from '../../infrastructure/kafka/kafka-producer.service';
 import { RedisService } from '../../infrastructure/redis/redis.service';
+import { ContributionAccountUpdatedEvent } from '../../domain/events';
 /**
  *
@@ -19,6 +21,7 @@ export class ContributionScheduler implements OnModuleInit {
     private readonly calculationService: ContributionCalculationService,
     private readonly snapshotService: SnapshotService,
     private readonly contributionRecordRepository: ContributionRecordRepository,
+    private readonly contributionAccountRepository: ContributionAccountRepository,
     private readonly outboxRepository: OutboxRepository,
     private readonly kafkaProducer: KafkaProducerService,
     private readonly redis: RedisService,
@ -174,4 +177,128 @@ export class ContributionScheduler implements OnModuleInit {
await this.redis.releaseLock(`${this.LOCK_KEY}:cleanup`, lockValue); await this.redis.releaseLock(`${this.LOCK_KEY}:cleanup`, lockValue);
} }
} }
/**
* 10
* 15
*/
@Cron('*/10 * * * *')
async publishRecentlyUpdatedAccounts(): Promise<void> {
const lockValue = await this.redis.acquireLock(`${this.LOCK_KEY}:incremental-sync`, 540); // 9-minute lock
if (!lockValue) {
return;
}
try {
// Find accounts updated in the past 15 minutes (5 minutes more than the 10-minute interval, to avoid missing boundary cases)
const fifteenMinutesAgo = new Date(Date.now() - 15 * 60 * 1000);
const accounts = await this.contributionAccountRepository.findRecentlyUpdated(fifteenMinutesAgo, 500);
if (accounts.length === 0) {
return;
}
const events = accounts.map((account) => {
const event = new ContributionAccountUpdatedEvent(
account.accountSequence,
account.personalContribution.value.toString(),
account.totalLevelPending.value.toString(),
account.totalBonusPending.value.toString(),
account.personalContribution.value
.plus(account.totalLevelPending.value)
.plus(account.totalBonusPending.value)
.toString(), // totalContribution = personal + pending level + pending bonus
account.effectiveContribution.value.toString(),
account.hasAdopted,
account.directReferralAdoptedCount,
account.unlockedLevelDepth,
account.unlockedBonusTiers,
account.createdAt,
);
return {
aggregateType: ContributionAccountUpdatedEvent.AGGREGATE_TYPE,
aggregateId: account.accountSequence,
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
};
});
await this.outboxRepository.saveMany(events);
this.logger.log(`Incremental sync: published ${accounts.length} recently updated accounts`);
} catch (error) {
this.logger.error('Failed to publish recently updated accounts', error);
} finally {
await this.redis.releaseLock(`${this.LOCK_KEY}:incremental-sync`, lockValue);
}
}
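The incremental sync deliberately looks back 15 minutes while firing every 10, so consecutive windows overlap by 5 minutes and an update landing right at a run boundary is still caught by the next run. A minimal sketch of that window arithmetic (helper names are hypothetical, not part of the service):

```typescript
// Hypothetical sketch of the incremental-sync window: the cron fires every
// 10 minutes, each run looks back 15 minutes, so windows overlap by 5 minutes.
const CRON_INTERVAL_MS = 10 * 60 * 1000;
const LOOKBACK_MS = 15 * 60 * 1000;

// Cutoff timestamp for a run starting at `now`.
function syncCutoff(now: number): number {
  return now - LOOKBACK_MS;
}

// An account updated at `updatedAt` is picked up by a run at `runAt`
// when it falls inside that run's lookback window.
function isPickedUp(updatedAt: number, runAt: number): boolean {
  return updatedAt > syncCutoff(runAt) && updatedAt <= runAt;
}

// Two consecutive runs: any update between them is covered by at least one.
const t0 = Date.now();
const t1 = t0 + CRON_INTERVAL_MS;
const justBeforeT0 = t0 - 1000; // updated 1s before the first run
console.log(isPickedUp(justBeforeT0, t0) || isPickedUp(justBeforeT0, t1)); // true
```

A consequence of the overlap is that the same account can be published twice in back-to-back runs, so the downstream CDC consumer presumably upserts by accountSequence rather than inserting blindly.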
/**
 * Full sync: runs daily at 04:00.
 * Publishes every contribution account.
 */
@Cron('0 4 * * *')
async publishAllAccountUpdates(): Promise<void> {
const lockValue = await this.redis.acquireLock(`${this.LOCK_KEY}:full-sync`, 3600); // 1-hour lock
if (!lockValue) {
return;
}
try {
this.logger.log('Starting daily full sync of contribution accounts...');
let page = 1;
const pageSize = 100;
let totalPublished = 0;
while (true) {
const { items: accounts, total } = await this.contributionAccountRepository.findMany({
page,
limit: pageSize,
orderBy: 'effectiveContribution',
order: 'desc',
});
if (accounts.length === 0) {
break;
}
const events = accounts.map((account) => {
const event = new ContributionAccountUpdatedEvent(
account.accountSequence,
account.personalContribution.value.toString(),
account.totalLevelPending.value.toString(),
account.totalBonusPending.value.toString(),
account.personalContribution.value
.plus(account.totalLevelPending.value)
.plus(account.totalBonusPending.value)
.toString(), // totalContribution = personal + pending level + pending bonus
account.effectiveContribution.value.toString(),
account.hasAdopted,
account.directReferralAdoptedCount,
account.unlockedLevelDepth,
account.unlockedBonusTiers,
account.createdAt,
);
return {
aggregateType: ContributionAccountUpdatedEvent.AGGREGATE_TYPE,
aggregateId: account.accountSequence,
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
};
});
await this.outboxRepository.saveMany(events);
totalPublished += accounts.length;
if (accounts.length < pageSize || page * pageSize >= total) {
break;
}
page++;
}
this.logger.log(`Daily full sync completed: published ${totalPublished} contribution account events`);
} catch (error) {
this.logger.error('Failed to publish all account updates', error);
} finally {
await this.redis.releaseLock(`${this.LOCK_KEY}:full-sync`, lockValue);
}
}
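The full-sync loop terminates on either of two conditions: a short page (fewer items than `pageSize`) or the page arithmetic covering `total`. That rule can be restated as a pure function (a sketch; the name `shouldStop` is hypothetical):

```typescript
// Hypothetical restatement of the full-sync paging loop's termination rule:
// stop when the page is empty or short, or when page * pageSize reaches total.
function shouldStop(pageItems: number, page: number, pageSize: number, total: number): boolean {
  return pageItems === 0 || pageItems < pageSize || page * pageSize >= total;
}

console.log(shouldStop(100, 1, 100, 250)); // false - more pages remain
console.log(shouldStop(50, 3, 100, 250));  // true  - short page
console.log(shouldStop(100, 3, 100, 250)); // true  - covered the total
```

Checking both conditions guards against `total` drifting while the loop runs: the short-page check still stops the loop even if new accounts were created mid-sync.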
}


@ -0,0 +1,274 @@
import { Injectable, Logger } from '@nestjs/common';
import { UnallocatedContributionRepository, UnallocatedContribution } from '../../infrastructure/persistence/repositories/unallocated-contribution.repository';
import { ContributionAccountRepository } from '../../infrastructure/persistence/repositories/contribution-account.repository';
import { ContributionRecordRepository } from '../../infrastructure/persistence/repositories/contribution-record.repository';
import { SystemAccountRepository } from '../../infrastructure/persistence/repositories/system-account.repository';
import { OutboxRepository } from '../../infrastructure/persistence/repositories/outbox.repository';
import { SyncedDataRepository } from '../../infrastructure/persistence/repositories/synced-data.repository';
import { UnitOfWork } from '../../infrastructure/persistence/unit-of-work/unit-of-work';
import { ContributionRecordAggregate } from '../../domain/aggregates/contribution-record.aggregate';
import { ContributionSourceType } from '../../domain/aggregates/contribution-account.aggregate';
import { ContributionAmount } from '../../domain/value-objects/contribution-amount.vo';
import { DistributionRate } from '../../domain/value-objects/distribution-rate.vo';
import { ContributionRecordSyncedEvent, SystemAccountSyncedEvent } from '../../domain/events';
/**
 * Bonus claim (backfill) service.
 * When a user newly satisfies a T2/T3 unlock condition, back-fills the
 * team bonus contributions that were previously held as unallocated.
 */
@Injectable()
export class BonusClaimService {
private readonly logger = new Logger(BonusClaimService.name);
constructor(
private readonly unallocatedContributionRepository: UnallocatedContributionRepository,
private readonly contributionAccountRepository: ContributionAccountRepository,
private readonly contributionRecordRepository: ContributionRecordRepository,
private readonly systemAccountRepository: SystemAccountRepository,
private readonly outboxRepository: OutboxRepository,
private readonly syncedDataRepository: SyncedDataRepository,
private readonly unitOfWork: UnitOfWork,
) {}
/**
 * Check whether new bonus tiers were unlocked and claim them.
 *
 * @param accountSequence the referrer's account sequence
 * @param previousCount direct-referral adopted count before the update
 * @param newCount direct-referral adopted count after the update
 */
async checkAndClaimBonus(
accountSequence: string,
previousCount: number,
newCount: number,
): Promise<void> {
// Check whether any new unlock thresholds have been crossed
const tiersToClaimList: number[] = [];
// T2: unlocked when >= 2 direct referrals have adopted
if (previousCount < 2 && newCount >= 2) {
tiersToClaimList.push(2);
}
// T3: unlocked when >= 4 direct referrals have adopted
if (previousCount < 4 && newCount >= 4) {
tiersToClaimList.push(3);
}
if (tiersToClaimList.length === 0) {
return;
}
this.logger.log(
`User ${accountSequence} unlocked bonus tiers: ${tiersToClaimList.join(', ')} ` +
`(directReferralAdoptedCount: ${previousCount} -> ${newCount})`,
);
// If already inside a transaction (i.e. called from ContributionCalculationService),
// execute directly to avoid nested-transaction timeouts
if (this.unitOfWork.isInTransaction()) {
for (const tier of tiersToClaimList) {
await this.claimBonusTier(accountSequence, tier);
}
} else {
// Standalone call: open a new transaction
await this.unitOfWork.executeInTransaction(async () => {
for (const tier of tiersToClaimList) {
await this.claimBonusTier(accountSequence, tier);
}
});
}
}
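The tier checks above implement a threshold-crossing rule: a tier is claimed exactly once, on the update whose count first reaches the threshold. A self-contained sketch (the helper `detectUnlockedTiers` is hypothetical, not part of the service):

```typescript
// Hypothetical sketch of the tier-unlock check: a tier fires only on the
// update that crosses its threshold, never again afterwards.
const TIER_THRESHOLDS: ReadonlyArray<[tier: number, threshold: number]> = [
  [2, 2], // T2: >= 2 direct referrals adopted
  [3, 4], // T3: >= 4 direct referrals adopted
];

function detectUnlockedTiers(previousCount: number, newCount: number): number[] {
  return TIER_THRESHOLDS
    .filter(([, threshold]) => previousCount < threshold && newCount >= threshold)
    .map(([tier]) => tier);
}

console.log(detectUnlockedTiers(1, 2)); // [2]
console.log(detectUnlockedTiers(1, 5)); // [2, 3]
console.log(detectUnlockedTiers(2, 3)); // [] (T2 was already claimed earlier)
```

Because the check compares `previousCount` against `newCount`, a jump that skips a threshold (e.g. 1 → 5) still claims every tier it crossed in one pass.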
/**
 * Claim all pending bonus records for a single tier.
 */
private async claimBonusTier(accountSequence: string, bonusTier: number): Promise<void> {
// 1. Query pending (unclaimed) records
const pendingRecords = await this.unallocatedContributionRepository.findPendingBonusByAccountSequence(
accountSequence,
bonusTier,
);
if (pendingRecords.length === 0) {
this.logger.debug(`No pending T${bonusTier} bonus records for ${accountSequence}`);
return;
}
this.logger.log(
`Claiming ${pendingRecords.length} T${bonusTier} bonus records for ${accountSequence}`,
);
// 2. Look up the original adoption data for treeCount and baseContribution
const adoptionDataMap = new Map<string, { treeCount: number; baseContribution: ContributionAmount }>();
for (const pending of pendingRecords) {
const adoptionIdStr = pending.sourceAdoptionId.toString();
if (!adoptionDataMap.has(adoptionIdStr)) {
const adoption = await this.syncedDataRepository.findSyncedAdoptionByOriginalId(pending.sourceAdoptionId);
if (adoption) {
adoptionDataMap.set(adoptionIdStr, {
treeCount: adoption.treeCount,
baseContribution: new ContributionAmount(adoption.contributionPerTree),
});
} else {
// Fall back to defaults and log a warning if the original adoption is missing
this.logger.warn(`Adoption not found for sourceAdoptionId: ${pending.sourceAdoptionId}, using default values`);
adoptionDataMap.set(adoptionIdStr, {
treeCount: 0,
baseContribution: new ContributionAmount(0),
});
}
}
}
// 3. Create contribution records
const contributionRecords: ContributionRecordAggregate[] = [];
for (const pending of pendingRecords) {
const adoptionData = adoptionDataMap.get(pending.sourceAdoptionId.toString())!;
const record = new ContributionRecordAggregate({
accountSequence: accountSequence,
sourceType: ContributionSourceType.TEAM_BONUS,
sourceAdoptionId: pending.sourceAdoptionId,
sourceAccountSequence: pending.sourceAccountSequence,
treeCount: adoptionData.treeCount,
baseContribution: adoptionData.baseContribution,
distributionRate: DistributionRate.BONUS_PER,
bonusTier: bonusTier,
amount: pending.amount,
effectiveDate: pending.effectiveDate,
expireDate: pending.expireDate,
});
contributionRecords.push(record);
}
// 4. Persist the contribution records
const savedRecords = await this.contributionRecordRepository.saveMany(contributionRecords);
// 5. Update the user's contribution account
let totalAmount = new ContributionAmount(0);
for (const pending of pendingRecords) {
totalAmount = new ContributionAmount(totalAmount.value.plus(pending.amount.value));
}
await this.contributionAccountRepository.updateContribution(
accountSequence,
ContributionSourceType.TEAM_BONUS,
totalAmount,
null,
bonusTier,
);
// 6. Mark the pending records as claimed
const pendingIds = pendingRecords.map((r) => r.id);
await this.unallocatedContributionRepository.claimBonusRecords(pendingIds, accountSequence);
// 7. Subtract the amount from HEADQUARTERS and delete its detail records
await this.systemAccountRepository.subtractContribution('HEADQUARTERS', null, totalAmount);
for (const pending of pendingRecords) {
await this.systemAccountRepository.deleteContributionRecordsByAdoption(
'HEADQUARTERS',
null,
pending.sourceAdoptionId,
pending.sourceAccountSequence,
);
}
// 8. Publish the HEADQUARTERS account update event
const headquartersAccount = await this.systemAccountRepository.findByTypeAndRegion('HEADQUARTERS', null);
if (headquartersAccount) {
const hqEvent = new SystemAccountSyncedEvent(
'HEADQUARTERS',
null,
headquartersAccount.name,
headquartersAccount.contributionBalance.value.toString(),
headquartersAccount.createdAt,
);
await this.outboxRepository.save({
aggregateType: SystemAccountSyncedEvent.AGGREGATE_TYPE,
aggregateId: 'HEADQUARTERS',
eventType: SystemAccountSyncedEvent.EVENT_TYPE,
payload: hqEvent.toPayload(),
});
}
// 9. Publish events to Kafka (via the outbox)
await this.publishBonusClaimEvents(accountSequence, savedRecords, pendingRecords);
this.logger.log(
`Claimed T${bonusTier} bonus for ${accountSequence}: ` +
`${pendingRecords.length} records, total amount: ${totalAmount.value.toString()}`,
);
}
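Step 5 above folds the pending amounts into a single immutable total, wrapping each sum in a fresh `ContributionAmount`. A stripped-down sketch of that pattern (the real service uses a Decimal-backed value object; this stub uses plain `number` and hypothetical names):

```typescript
// Hypothetical stub of the immutable-amount fold in step 5. The real
// ContributionAmount wraps a Decimal; `Amount` here stubs it with number.
class Amount {
  constructor(readonly value: number) {}
  plus(other: Amount): Amount {
    return new Amount(this.value + other.value); // never mutates, returns a new Amount
  }
}

function sumPending(records: { amount: Amount }[]): Amount {
  return records.reduce((sum, r) => sum.plus(r.amount), new Amount(0));
}

console.log(sumPending([{ amount: new Amount(1.5) }, { amount: new Amount(2.5) }]).value); // 4
```

Keeping the value object immutable means a partially built total can never leak into a saved aggregate; each intermediate sum is a distinct object.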
/**
 * Publish bonus-claim events to the outbox.
 */
private async publishBonusClaimEvents(
accountSequence: string,
savedRecords: ContributionRecordAggregate[],
pendingRecords: UnallocatedContribution[],
): Promise<void> {
// 1. Publish contribution-record sync events (for mining-admin-service CDC)
for (const record of savedRecords) {
const event = new ContributionRecordSyncedEvent(
record.id!,
record.accountSequence,
record.sourceType,
record.sourceAdoptionId,
record.sourceAccountSequence,
record.treeCount,
record.baseContribution.value.toString(),
record.distributionRate.value.toString(),
record.levelDepth,
record.bonusTier,
record.amount.value.toString(),
record.effectiveDate,
record.expireDate,
record.isExpired,
record.createdAt,
);
await this.outboxRepository.save({
aggregateType: ContributionRecordSyncedEvent.AGGREGATE_TYPE,
aggregateId: record.id!.toString(),
eventType: ContributionRecordSyncedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
}
// 2. Publish the backfill event to mining-wallet-service
const userContributions = savedRecords.map((record, index) => ({
accountSequence: record.accountSequence,
contributionType: 'TEAM_BONUS',
amount: record.amount.value.toString(),
bonusTier: record.bonusTier,
effectiveDate: record.effectiveDate.toISOString(),
expireDate: record.expireDate.toISOString(),
sourceAdoptionId: record.sourceAdoptionId.toString(),
sourceAccountSequence: record.sourceAccountSequence,
isBackfill: true, // mark as a backfill
}));
const eventId = `bonus-claim-${accountSequence}-${Date.now()}`;
const payload = {
eventType: 'BonusClaimed',
eventId,
timestamp: new Date().toISOString(),
payload: {
accountSequence,
bonusTier: savedRecords[0]?.bonusTier,
claimedCount: savedRecords.length,
userContributions,
},
};
await this.outboxRepository.save({
eventType: 'BonusClaimed',
topic: 'contribution.bonus.claimed',
key: accountSequence,
payload,
aggregateId: accountSequence,
aggregateType: 'ContributionAccount',
});
}
}


@ -9,10 +9,12 @@ import { OutboxRepository } from '../../infrastructure/persistence/repositories/
import { UnitOfWork } from '../../infrastructure/persistence/unit-of-work/unit-of-work';
import { ContributionAccountAggregate, ContributionSourceType } from '../../domain/aggregates/contribution-account.aggregate';
import { ContributionRecordAggregate } from '../../domain/aggregates/contribution-record.aggregate';
import { ContributionAmount } from '../../domain/value-objects/contribution-amount.vo';
import { SyncedReferral } from '../../domain/repositories/synced-data.repository.interface';
import { ContributionDistributionPublisherService } from './contribution-distribution-publisher.service';
import { ContributionRateService } from './contribution-rate.service';
import { BonusClaimService } from './bonus-claim.service';
import { ContributionRecordSyncedEvent, NetworkProgressUpdatedEvent, ContributionAccountUpdatedEvent, SystemAccountSyncedEvent, SystemContributionRecordCreatedEvent, UnallocatedContributionSyncedEvent } from '../../domain/events';
/**
 *
@ -33,6 +35,7 @@ export class ContributionCalculationService {
private readonly unitOfWork: UnitOfWork,
private readonly distributionPublisher: ContributionDistributionPublisherService,
private readonly contributionRateService: ContributionRateService,
private readonly bonusClaimService: BonusClaimService,
) {}
/**
@ -111,6 +114,49 @@ export class ContributionCalculationService {
`teamBonus=${result.teamBonusRecords.length}, ` +
`unallocated=${result.unallocatedContributions.length}`,
);
// Update network-wide adoption progress (NetworkAdoptionProgress table)
// New-adopter check: no prior account record means a new user
const isNewUser = !adopterAccount;
await this.contributionRateService.updateNetworkProgress(
adoption.treeCount,
adoption.adoptionDate,
adoption.originalAdoptionId,
isNewUser,
);
// Publish the network-progress-updated event (mining-service syncs the network-wide theoretical contribution)
await this.publishNetworkProgressEvent();
}
/**
 * Publish the network-progress-updated event.
 */
private async publishNetworkProgressEvent(): Promise<void> {
try {
const progress = await this.contributionRateService.getNetworkProgress();
const event = new NetworkProgressUpdatedEvent(
progress.totalTreeCount,
progress.totalAdoptionOrders,
progress.totalAdoptedUsers,
progress.currentUnit,
progress.currentMultiplier.toString(),
progress.currentContributionPerTree.toString(),
progress.nextUnitTreeCount,
);
await this.outboxRepository.save({
aggregateType: NetworkProgressUpdatedEvent.AGGREGATE_TYPE,
aggregateId: 'network',
eventType: NetworkProgressUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
this.logger.debug(`Published NetworkProgressUpdatedEvent: trees=${progress.totalTreeCount}`);
} catch (error) {
this.logger.error('Failed to publish NetworkProgressUpdatedEvent', error);
}
}
/**
@ -164,6 +210,8 @@ export class ContributionCalculationService {
): Promise<void> {
// Collect all saved records (with IDs) for event publishing
const savedRecords: ContributionRecordAggregate[] = [];
// Collect the sequence numbers of all updated accounts (for publishing account-updated events)
const updatedAccountSequences = new Set<string>();
// 1. Save the personal contribution record
const savedPersonalRecord = await this.contributionRecordRepository.save(result.personalRecord);
@ -178,6 +226,7 @@ export class ContributionCalculationService {
}
account.addPersonalContribution(result.personalRecord.amount);
await this.contributionAccountRepository.save(account);
updatedAccountSequences.add(result.personalRecord.accountSequence);
// 2. Save team level contribution records
if (result.teamLevelRecords.length > 0) {
@ -193,6 +242,7 @@ export class ContributionCalculationService {
record.levelDepth, // pass the level depth
null,
);
updatedAccountSequences.add(record.accountSequence);
}
}
@ -210,6 +260,7 @@ export class ContributionCalculationService {
null,
record.bonusTier, // pass the bonus tier
);
updatedAccountSequences.add(record.accountSequence);
}
}
@ -217,7 +268,7 @@ export class ContributionCalculationService {
const effectiveDate = result.personalRecord.effectiveDate;
const expireDate = result.personalRecord.expireDate;
// 4. Save unallocated contributions and publish sync events
if (result.unallocatedContributions.length > 0) {
await this.unallocatedContributionRepository.saveMany(
result.unallocatedContributions.map((u) => ({
@ -228,28 +279,189 @@ export class ContributionCalculationService {
expireDate,
})),
);
// Aggregate unallocated contributions into the HEADQUARTERS account
const totalUnallocatedAmount = result.unallocatedContributions.reduce(
(sum, u) => sum.add(u.amount),
new ContributionAmount(0),
);
await this.systemAccountRepository.addContribution(
'HEADQUARTERS',
null,
totalUnallocatedAmount,
);
// Create a HEADQUARTERS detail record for each unallocated contribution
for (const unallocated of result.unallocatedContributions) {
// Determine the source type and level depth
const sourceType = unallocated.type as string; // LEVEL_OVERFLOW / LEVEL_NO_ANCESTOR / BONUS_TIER_1/2/3
const levelDepth = unallocated.levelDepth;
const savedRecord = await this.systemAccountRepository.saveContributionRecord({
accountType: 'HEADQUARTERS',
regionCode: null,
sourceAdoptionId,
sourceAccountSequence,
sourceType,
levelDepth,
distributionRate: 0, // unallocated contributions have no fixed rate
amount: unallocated.amount,
effectiveDate,
expireDate: null,
});
// Publish the HEADQUARTERS contribution detail event
const recordEvent = new SystemContributionRecordCreatedEvent(
savedRecord.id,
'HEADQUARTERS',
null,
sourceAdoptionId,
sourceAccountSequence,
sourceType as any,
levelDepth,
0,
unallocated.amount.value.toString(),
effectiveDate,
null,
savedRecord.createdAt,
);
await this.outboxRepository.save({
aggregateType: SystemContributionRecordCreatedEvent.AGGREGATE_TYPE,
aggregateId: savedRecord.id.toString(),
eventType: SystemContributionRecordCreatedEvent.EVENT_TYPE,
payload: recordEvent.toPayload(),
});
}
// Publish the HEADQUARTERS account sync event
const headquartersAccount = await this.systemAccountRepository.findByTypeAndRegion('HEADQUARTERS', null);
if (headquartersAccount) {
const hqEvent = new SystemAccountSyncedEvent(
'HEADQUARTERS',
null, // region code (headquarters has none)
headquartersAccount.name,
headquartersAccount.contributionBalance.value.toString(),
headquartersAccount.createdAt,
);
await this.outboxRepository.save({
aggregateType: SystemAccountSyncedEvent.AGGREGATE_TYPE,
aggregateId: 'HEADQUARTERS',
eventType: SystemAccountSyncedEvent.EVENT_TYPE,
payload: hqEvent.toPayload(),
});
}
// Publish unallocated-contribution sync events (mining-service syncs pending-unlock contributions)
for (const unallocated of result.unallocatedContributions) {
const event = new UnallocatedContributionSyncedEvent(
sourceAdoptionId,
sourceAccountSequence,
unallocated.wouldBeAccountSequence,
unallocated.type,
unallocated.amount.value.toString(),
unallocated.reason,
effectiveDate,
expireDate,
);
await this.outboxRepository.save({
aggregateType: UnallocatedContributionSyncedEvent.AGGREGATE_TYPE,
aggregateId: `${sourceAdoptionId}-${unallocated.type}`,
eventType: UnallocatedContributionSyncedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
}
}
// 5. Save system account contributions and publish sync events
if (result.systemContributions.length > 0) {
await this.systemAccountRepository.ensureSystemAccountsExist();
for (const sys of result.systemContributions) {
// Dynamically create/update the system account
await this.systemAccountRepository.addContribution(
sys.accountType,
sys.regionCode,
sys.amount,
);
// Save the contribution detail record
const savedRecord = await this.systemAccountRepository.saveContributionRecord({
accountType: sys.accountType,
regionCode: sys.regionCode,
sourceAdoptionId,
sourceAccountSequence,
sourceType: 'FIXED_RATE', // fixed-rate allocation
levelDepth: null,
distributionRate: sys.rate.value.toNumber(),
amount: sys.amount,
effectiveDate,
expireDate: null,
});
// Publish the system-account sync event (mining-service syncs system-account contributions)
const systemAccount = await this.systemAccountRepository.findByTypeAndRegion(
sys.accountType,
sys.regionCode,
);
if (systemAccount) {
const event = new SystemAccountSyncedEvent(
sys.accountType,
sys.regionCode,
systemAccount.name,
systemAccount.contributionBalance.value.toString(),
systemAccount.createdAt,
);
await this.outboxRepository.save({
aggregateType: SystemAccountSyncedEvent.AGGREGATE_TYPE,
aggregateId: `${sys.accountType}:${sys.regionCode || 'null'}`,
eventType: SystemAccountSyncedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
// Publish the system-account contribution detail event (mining-admin-service syncs detail records)
const recordEvent = new SystemContributionRecordCreatedEvent(
savedRecord.id,
sys.accountType,
sys.regionCode, // pass the region code
sourceAdoptionId,
sourceAccountSequence,
'FIXED_RATE', // fixed-rate allocation
null, // no level depth
sys.rate.value.toNumber(),
sys.amount.value.toString(),
effectiveDate,
null,
savedRecord.createdAt,
);
await this.outboxRepository.save({
aggregateType: SystemContributionRecordCreatedEvent.AGGREGATE_TYPE,
aggregateId: savedRecord.id.toString(),
eventType: SystemContributionRecordCreatedEvent.EVENT_TYPE,
payload: recordEvent.toPayload(),
});
}
}
}
// 6. Publish contribution-record sync events (for mining-admin-service) - using saved records with IDs
await this.publishContributionRecordEvents(savedRecords);
// 7. Publish events for all updated accounts (CDC sync to mining-admin-service)
await this.publishUpdatedAccountEvents(updatedAccountSequences);
}
/**
 * Publish account-updated events for every account touched in this calculation.
 */
private async publishUpdatedAccountEvents(accountSequences: Set<string>): Promise<void> {
if (accountSequences.size === 0) return;
for (const accountSequence of accountSequences) {
const account = await this.contributionAccountRepository.findByAccountSequence(accountSequence);
if (account) {
await this.publishContributionAccountUpdatedEvent(account);
}
}
}
/**
@ -300,11 +512,15 @@ export class ContributionCalculationService {
if (!account.hasAdopted) {
account.markAsAdopted();
await this.contributionAccountRepository.save(account);
// Publish the account-updated event to the outbox (CDC sync to mining-admin-service)
await this.publishContributionAccountUpdatedEvent(account);
}
}
/**
 * Update the referrer's (upline's) unlock status.
 *
 */
private async updateReferrerUnlockStatus(referrerAccountSequence: string): Promise<void> {
const account = await this.contributionAccountRepository.findByAccountSequence(referrerAccountSequence);
@ -316,16 +532,27 @@ export class ContributionCalculationService {
);
// Update unlock status
const previousCount = account.directReferralAdoptedCount;
if (directReferralAdoptedCount > previousCount) {
// Incremental update needed
for (let i = previousCount; i < directReferralAdoptedCount; i++) {
account.incrementDirectReferralAdoptedCount();
}
await this.contributionAccountRepository.save(account);
// Publish the account-updated event to the outbox (CDC sync to mining-admin-service)
await this.publishContributionAccountUpdatedEvent(account);
this.logger.debug(
`Updated referrer ${referrerAccountSequence} unlock status: level=${account.unlockedLevelDepth}, bonus=${account.unlockedBonusTiers}`,
);
// Check and back-fill bonuses (T2: >= 2 direct referrals, T3: >= 4 direct referrals)
await this.bonusClaimService.checkAndClaimBonus(
referrerAccountSequence,
previousCount,
directReferralAdoptedCount,
);
}
}
@ -393,4 +620,43 @@ export class ContributionCalculationService {
},
};
}
/**
 * Publish the ContributionAccountUpdatedEvent (CDC sync to mining-admin-service).
 */
private async publishContributionAccountUpdatedEvent(
account: ContributionAccountAggregate,
): Promise<void> {
// Total contribution = personal + pending level + pending bonus
const totalContribution = account.personalContribution.value
.plus(account.totalLevelPending.value)
.plus(account.totalBonusPending.value);
const event = new ContributionAccountUpdatedEvent(
account.accountSequence,
account.personalContribution.value.toString(),
account.totalLevelPending.value.toString(),
account.totalBonusPending.value.toString(),
totalContribution.toString(),
account.effectiveContribution.value.toString(),
account.hasAdopted,
account.directReferralAdoptedCount,
account.unlockedLevelDepth,
account.unlockedBonusTiers,
account.createdAt,
);
await this.outboxRepository.save({
aggregateType: ContributionAccountUpdatedEvent.AGGREGATE_TYPE,
aggregateId: account.accountSequence,
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
payload: event.toPayload(),
});
this.logger.debug(
`Published ContributionAccountUpdatedEvent for ${account.accountSequence}: ` +
`directReferralAdoptedCount=${account.directReferralAdoptedCount}, ` +
`hasAdopted=${account.hasAdopted}`,
);
}
}


@ -121,11 +121,16 @@ export class ContributionDistributionPublisherService {
return result.systemContributions.map((sys) => ({
accountType: sys.accountType,
amount: sys.amount.value.toString(),
// Province code: PROVINCE uses its own regionCode; CITY passes the province code (needed to create the province)
provinceCode:
sys.accountType === 'PROVINCE'
? sys.regionCode || provinceCode
: sys.accountType === 'CITY'
? provinceCode // CITY needs the province code to create the province (if it does not exist)
: undefined,
// City code: only CITY has one
cityCode:
sys.accountType === 'CITY' ? sys.regionCode || cityCode : undefined,
neverExpires: sys.accountType === 'OPERATION', // the operation account never expires
}));
}
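The nested ternaries above encode a per-account-type region mapping. As a pure function the rule is easier to see (a sketch with hypothetical names, restating the diff's logic, not the service's actual API):

```typescript
// Hypothetical restatement of the provinceCode/cityCode mapping:
// PROVINCE prefers its own regionCode; CITY carries the province code
// (so the province can be created if missing) plus its own city code;
// other account types carry neither.
interface RegionCodes { provinceCode?: string; cityCode?: string; }

function regionCodesFor(
  accountType: string,
  regionCode: string | null,
  provinceCode: string,
  cityCode: string,
): RegionCodes {
  if (accountType === 'PROVINCE') {
    return { provinceCode: regionCode || provinceCode };
  }
  if (accountType === 'CITY') {
    return { provinceCode, cityCode: regionCode || cityCode };
  }
  return {};
}

console.log(regionCodesFor('PROVINCE', '440000', '440000', '440100'));
// { provinceCode: '440000' }
console.log(regionCodesFor('CITY', '440100', '440000', '440100'));
// { provinceCode: '440000', cityCode: '440100' }
console.log(regionCodesFor('OPERATION', null, '440000', '440100'));
// {}
```

The key change in the diff is that PROVINCE and CITY now prefer `sys.regionCode` over the caller-supplied fallback, while CITY additionally forwards the province code for on-demand province creation.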

View File

@ -0,0 +1,40 @@
/**
 * Contribution account updated event.
 * Carries directReferralAdoptedCount, unlockedLevelDepth and unlockedBonusTiers
 * so mining-admin-service can stay in sync.
 */
export class ContributionAccountUpdatedEvent {
static readonly EVENT_TYPE = 'ContributionAccountUpdated';
static readonly AGGREGATE_TYPE = 'ContributionAccount';
constructor(
public readonly accountSequence: string,
public readonly personalContribution: string,
public readonly teamLevelContribution: string,
public readonly teamBonusContribution: string,
public readonly totalContribution: string,
public readonly effectiveContribution: string,
public readonly hasAdopted: boolean,
public readonly directReferralAdoptedCount: number,
public readonly unlockedLevelDepth: number,
public readonly unlockedBonusTiers: number,
public readonly createdAt: Date,
) {}
toPayload(): Record<string, any> {
return {
eventType: ContributionAccountUpdatedEvent.EVENT_TYPE,
accountSequence: this.accountSequence,
personalContribution: this.personalContribution,
teamLevelContribution: this.teamLevelContribution,
teamBonusContribution: this.teamBonusContribution,
totalContribution: this.totalContribution,
effectiveContribution: this.effectiveContribution,
hasAdopted: this.hasAdopted,
directReferralAdoptedCount: this.directReferralAdoptedCount,
unlockedLevelDepth: this.unlockedLevelDepth,
unlockedBonusTiers: this.unlockedBonusTiers,
createdAt: this.createdAt.toISOString(),
};
}
}


@ -1,7 +1,11 @@
export * from './contribution-calculated.event'; export * from './contribution-calculated.event';
export * from './daily-snapshot-created.event'; export * from './daily-snapshot-created.event';
export * from './contribution-account-synced.event'; export * from './contribution-account-synced.event';
export * from './contribution-account-updated.event';
export * from './referral-synced.event'; export * from './referral-synced.event';
export * from './adoption-synced.event'; export * from './adoption-synced.event';
export * from './contribution-record-synced.event'; export * from './contribution-record-synced.event';
export * from './network-progress-updated.event'; export * from './network-progress-updated.event';
export * from './system-account-synced.event';
export * from './system-contribution-record-created.event';
export * from './unallocated-contribution-synced.event';


@ -0,0 +1,27 @@
/**
 * System account synced event.
 * Consumed by mining-service to sync system-account contributions.
 */
export class SystemAccountSyncedEvent {
static readonly EVENT_TYPE = 'SystemAccountSynced';
static readonly AGGREGATE_TYPE = 'SystemAccount';
constructor(
public readonly accountType: string, // OPERATION / PROVINCE / CITY / HEADQUARTERS
public readonly regionCode: string | null, // province/city code, e.g. 440000, 440100
public readonly name: string,
public readonly contributionBalance: string,
public readonly createdAt: Date,
) {}
toPayload(): Record<string, any> {
return {
eventType: SystemAccountSyncedEvent.EVENT_TYPE,
accountType: this.accountType,
regionCode: this.regionCode,
name: this.name,
contributionBalance: this.contributionBalance,
createdAt: this.createdAt.toISOString(),
};
}
}


@ -0,0 +1,56 @@
/**
 * Source types for system-account contribution:
 * - FIXED_RATE: fixed-rate allocation (OPERATION 12%, PROVINCE 1%, CITY 2%)
 * - LEVEL_OVERFLOW: level overflow, goes to headquarters
 * - LEVEL_NO_ANCESTOR: no upline, goes to headquarters
 * - BONUS_TIER_1/2/3: team bonus not yet unlocked, goes to headquarters
 */
export type SystemContributionSourceType =
| 'FIXED_RATE'
| 'LEVEL_OVERFLOW'
| 'LEVEL_NO_ANCESTOR'
| 'BONUS_TIER_1'
| 'BONUS_TIER_2'
| 'BONUS_TIER_3';
/**
 * System contribution detail record created event.
 * Consumed by mining-admin-service to sync detail records.
 */
export class SystemContributionRecordCreatedEvent {
static readonly EVENT_TYPE = 'SystemContributionRecordCreated';
static readonly AGGREGATE_TYPE = 'SystemContributionRecord';
constructor(
public readonly recordId: bigint, // detail record ID
public readonly accountType: string, // system account type: OPERATION/PROVINCE/CITY/HEADQUARTERS
public readonly regionCode: string | null, // region code (province/city code, e.g. 440000, 440100)
public readonly sourceAdoptionId: bigint, // source adoption ID
public readonly sourceAccountSequence: string, // adopter's account sequence
public readonly sourceType: SystemContributionSourceType, // source type
public readonly levelDepth: number | null, // level depth 1-15 (only for LEVEL_OVERFLOW/LEVEL_NO_ANCESTOR)
public readonly distributionRate: number, // distribution rate
public readonly amount: string, // contribution amount
public readonly effectiveDate: Date, // effective date
public readonly expireDate: Date | null, // expire date
public readonly createdAt: Date, // created at
) {}
toPayload(): Record<string, any> {
return {
eventType: SystemContributionRecordCreatedEvent.EVENT_TYPE,
recordId: this.recordId.toString(),
accountType: this.accountType,
regionCode: this.regionCode,
sourceAdoptionId: this.sourceAdoptionId.toString(),
sourceAccountSequence: this.sourceAccountSequence,
sourceType: this.sourceType,
levelDepth: this.levelDepth,
distributionRate: this.distributionRate,
amount: this.amount,
effectiveDate: this.effectiveDate.toISOString(),
expireDate: this.expireDate?.toISOString() ?? null,
createdAt: this.createdAt.toISOString(),
};
}
}
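The `toPayload()` methods above convert every `bigint` to a string and every `Date` to an ISO string before the event is serialized. A minimal standalone sketch of why (the names here are illustrative, not the service's real event class): `JSON.stringify` throws on `BigInt` values, so the conversion has to happen first.

```typescript
// JSON.stringify cannot serialize BigInt, which is what the toPayload()
// pattern in the diff works around.
const record = { recordId: 123n, createdAt: new Date(0) };

let stringifyFailed = false;
try {
  JSON.stringify(record); // throws: "Do not know how to serialize a BigInt"
} catch {
  stringifyFailed = true;
}

// toPayload()-style conversion: bigint -> string, Date -> ISO string.
const payload = {
  recordId: record.recordId.toString(),
  createdAt: record.createdAt.toISOString(),
};
const json = JSON.stringify(payload);
console.log(stringifyFailed, json);
```

The consumer side then parses the string back with `BigInt(payload.recordId)`, which round-trips without precision loss, unlike serializing as a JSON number.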


@@ -0,0 +1,33 @@
/**
*
* mining-service
*/
export class UnallocatedContributionSyncedEvent {
static readonly EVENT_TYPE = 'UnallocatedContributionSynced';
static readonly AGGREGATE_TYPE = 'UnallocatedContribution';
constructor(
public readonly sourceAdoptionId: bigint,
public readonly sourceAccountSequence: string,
public readonly wouldBeAccountSequence: string | null,
public readonly contributionType: string, // LEVEL_NO_ANCESTOR, LEVEL_OVERFLOW, BONUS_TIER_1, BONUS_TIER_2, BONUS_TIER_3
public readonly amount: string,
public readonly reason: string | null,
public readonly effectiveDate: Date,
public readonly expireDate: Date,
) {}
toPayload(): Record<string, any> {
return {
eventType: UnallocatedContributionSyncedEvent.EVENT_TYPE,
sourceAdoptionId: this.sourceAdoptionId.toString(),
sourceAccountSequence: this.sourceAccountSequence,
wouldBeAccountSequence: this.wouldBeAccountSequence,
contributionType: this.contributionType,
amount: this.amount,
reason: this.reason,
effectiveDate: this.effectiveDate.toISOString(),
expireDate: this.expireDate.toISOString(),
};
}
}


@@ -5,6 +5,16 @@ import { ContributionAccountAggregate, ContributionSourceType } from '../aggrega
import { ContributionRecordAggregate } from '../aggregates/contribution-record.aggregate';
import { SyncedAdoption, SyncedReferral } from '../repositories/synced-data.repository.interface';
/**
*
*/
export interface SystemContributionAllocation {
accountType: 'OPERATION' | 'PROVINCE' | 'CITY' | 'HEADQUARTERS';
regionCode: string | null; // province/city code, e.g. 440000, 440100
rate: DistributionRate;
amount: ContributionAmount;
}
/**
 *
 */
@@ -27,12 +37,8 @@ export interface ContributionDistributionResult {
     reason: string;
   }[];
 
-  // System-account contributions
-  systemContributions: {
-    accountType: 'OPERATION' | 'PROVINCE' | 'CITY';
-    rate: DistributionRate;
-    amount: ContributionAmount;
-  }[];
+  // System-account contributions (broken down by province/city)
+  systemContributions: SystemContributionAllocation[];
 }
/** /**
@@ -85,23 +91,31 @@ export class ContributionCalculatorService {
     });
 
     // 2. System-account contributions (15%)
-    result.systemContributions = [
-      {
-        accountType: 'OPERATION',
-        rate: DistributionRate.OPERATION,
-        amount: totalContribution.multiply(DistributionRate.OPERATION.value),
-      },
-      {
-        accountType: 'PROVINCE',
-        rate: DistributionRate.PROVINCE,
-        amount: totalContribution.multiply(DistributionRate.PROVINCE.value),
-      },
-      {
-        accountType: 'CITY',
-        rate: DistributionRate.CITY,
-        amount: totalContribution.multiply(DistributionRate.CITY.value),
-      },
-    ];
+    // Operation account (nationwide) - 12%
+    result.systemContributions.push({
+      accountType: 'OPERATION',
+      regionCode: null,
+      rate: DistributionRate.OPERATION,
+      amount: totalContribution.multiply(DistributionRate.OPERATION.value),
+    });
+
+    // Province-company account - 1% (by the province selected at adoption)
+    const provinceCode = adoption.selectedProvince;
+    result.systemContributions.push({
+      accountType: 'PROVINCE',
+      regionCode: provinceCode || null,
+      rate: DistributionRate.PROVINCE,
+      amount: totalContribution.multiply(DistributionRate.PROVINCE.value),
+    });
+
+    // City-company account - 2% (by the city selected at adoption)
+    const cityCode = adoption.selectedCity;
+    result.systemContributions.push({
+      accountType: 'CITY',
+      regionCode: cityCode || null,
+      rate: DistributionRate.CITY,
+      amount: totalContribution.multiply(DistributionRate.CITY.value),
+    });
 
     // 3. Team contributions (15%)
     this.distributeTeamContribution(

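The fixed-rate split in this hunk (OPERATION 12%, PROVINCE 1%, CITY 2%, together the 15% system share) can be sketched as follows. This is a hypothetical illustration, not the service's `DistributionRate` value objects; rates are kept as integer basis points to avoid floating-point drift.

```typescript
// Fixed-rate system-account split, expressed in basis points (1 bp = 0.01%).
const RATES_BP: Record<string, number> = {
  OPERATION: 1200, // 12.00%
  PROVINCE: 100,   //  1.00%
  CITY: 200,       //  2.00%
};

function splitSystemShare(total: number): Record<string, number> {
  const out: Record<string, number> = {};
  for (const [accountType, bp] of Object.entries(RATES_BP)) {
    // Integer basis points keep the arithmetic exact for whole-number totals.
    out[accountType] = (total * bp) / 10_000;
  }
  return out;
}

const parts = splitSystemShare(1000);
// 1000 -> OPERATION 120, PROVINCE 10, CITY 20 (system share 150 = 15%)
console.log(parts);
```

The real code performs the same multiplications through `ContributionAmount.multiply`, which presumably wraps a decimal type for the same exactness reason.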

@@ -13,11 +13,11 @@ import { KafkaModule } from './kafka/kafka.module';
 import { KafkaProducerService } from './kafka/kafka-producer.service';
 import { CDCConsumerService } from './kafka/cdc-consumer.service';
 import { RedisModule } from './redis/redis.module';
+import { SYNCED_DATA_REPOSITORY } from '../domain/repositories/synced-data.repository.interface';
 
 // Repository injection tokens
 export const CONTRIBUTION_ACCOUNT_REPOSITORY = 'CONTRIBUTION_ACCOUNT_REPOSITORY';
 export const CONTRIBUTION_RECORD_REPOSITORY = 'CONTRIBUTION_RECORD_REPOSITORY';
-export const SYNCED_DATA_REPOSITORY = 'SYNCED_DATA_REPOSITORY';
 
 @Module({
   imports: [PrismaModule, KafkaModule, RedisModule],


@@ -53,6 +53,12 @@ export type TransactionalCDCHandlerWithResult<T> = (event: CDCEvent, tx: Transac
 /** Callback invoked after the transaction commits */
 export type PostCommitCallback<T> = (result: T) => Promise<void>;
 
+/** Configuration for one topic consumption phase */
+export interface TopicPhase {
+  topic: string;
+  tableName: string;
+}
+
 @Injectable()
 export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
private readonly logger = new Logger(CDCConsumerService.name); private readonly logger = new Logger(CDCConsumerService.name);
@@ -61,6 +67,14 @@ export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
   private handlers: Map<string, CDCHandler> = new Map();
   private isRunning = false;
 
+  // Phased-consumption state
+  private topicPhases: TopicPhase[] = [];
+  private currentPhaseIndex = 0;
+  private sequentialMode = false;
+  // Initial-sync flag (true only after the sequential sync has fully completed)
+  private initialSyncCompleted = false;
+
   constructor(
     private readonly configService: ConfigService,
     private readonly prisma: PrismaService,
@@ -247,7 +261,14 @@ export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
   }
 
   /**
-   *
+   * Start the CDC consumer.
+   *
+   * Topics are consumed in sequential phases:
+   * 1. users (user_accounts)
+   * 2. referral relationships (referral_relationships) - depends on users
+   * 3. adoption orders (planting_orders) - depends on the above
+   *
+   * Continuous consumption starts only after every phase has caught up.
    */
   async start(): Promise<void> {
     if (this.isRunning) {
@@ -259,36 +280,213 @@ export class CDCConsumerService implements OnModuleInit, OnModuleDestroy {
       await this.consumer.connect();
       this.logger.log('CDC consumer connected');
 
-      // Subscribe to the Debezium CDC topics (full sync from the 1.0 services)
-      const topics = [
-        // user accounts table (identity-service: user_accounts)
-        this.configService.get<string>('CDC_TOPIC_USERS', 'cdc.identity.public.user_accounts'),
-        // adoption orders table (planting-service: planting_orders)
-        this.configService.get<string>('CDC_TOPIC_ADOPTIONS', 'cdc.planting.public.planting_orders'),
-        // referral relationships table (referral-service: referral_relationships)
-        this.configService.get<string>('CDC_TOPIC_REFERRALS', 'cdc.referral.public.referral_relationships'),
-      ];
-
-      await this.consumer.subscribe({
-        topics,
-        fromBeginning: true, // full sync of historical data on first start
-      });
-      this.logger.log(`Subscribed to topics: ${topics.join(', ')}`);
-
-      await this.consumer.run({
-        eachMessage: async (payload: EachMessagePayload) => {
-          await this.handleMessage(payload);
-        },
-      });
-
-      this.isRunning = true;
-      this.logger.log('CDC consumer started with transactional idempotency protection');
+      // Configure the sequential consumption phases (order matters!)
+      this.topicPhases = [
+        {
+          topic: this.configService.get<string>('CDC_TOPIC_USERS', 'cdc.identity.public.user_accounts'),
+          tableName: 'user_accounts',
+        },
+        {
+          topic: this.configService.get<string>('CDC_TOPIC_REFERRALS', 'cdc.referral.public.referral_relationships'),
+          tableName: 'referral_relationships',
+        },
+        {
+          topic: this.configService.get<string>('CDC_TOPIC_ADOPTIONS', 'cdc.planting.public.planting_orders'),
+          tableName: 'planting_orders',
+        },
+      ];
+      this.currentPhaseIndex = 0;
+      this.sequentialMode = true;
+      this.isRunning = true;
+
+      // Start sequential consumption (blocks until done, so data dependencies stay ordered)
+      await this.startSequentialConsumption();
+      this.logger.log('CDC consumer started with sequential phase consumption');
     } catch (error) {
       this.logger.error('Failed to start CDC consumer', error);
       // Do not rethrow: the service may start without Kafka (local development)
     }
   }
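The phase order in `start()` encodes foreign-key-like dependencies: referral rows reference users, and adoption rows reference both. A small, self-contained sketch (illustrative helper, not part of the service) of checking that a proposed phase order lists every table after its dependencies:

```typescript
// Dependency map mirroring the phase order used by the CDC consumer.
const DEPENDS_ON: Record<string, string[]> = {
  user_accounts: [],
  referral_relationships: ["user_accounts"],
  planting_orders: ["user_accounts", "referral_relationships"],
};

// A phase order is valid when each table appears only after all of its
// dependencies have already been consumed.
function isValidPhaseOrder(order: string[]): boolean {
  const seen = new Set<string>();
  for (const table of order) {
    if (!(DEPENDS_ON[table] ?? []).every((dep) => seen.has(dep))) {
      return false;
    }
    seen.add(table);
  }
  return true;
}

console.log(
  isValidPhaseOrder(["user_accounts", "referral_relationships", "planting_orders"]), // true
  isValidPhaseOrder(["planting_orders", "user_accounts", "referral_relationships"]), // false
);
```

This is why the diff swaps the subscription order relative to the old code: the old parallel subscription listed adoptions before referrals, which was harmless only because all topics were consumed concurrently.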
/**
 * Consume each configured phase in order, then switch to continuous mode.
 */
private async startSequentialConsumption(): Promise<void> {
for (let i = 0; i < this.topicPhases.length; i++) {
this.currentPhaseIndex = i;
const phase = this.topicPhases[i];
this.logger.log(`[CDC] Starting phase ${i + 1}/${this.topicPhases.length}: ${phase.tableName} (${phase.topic})`);
// Consume the current phase until it catches up with the latest offsets
await this.consumePhaseToEnd(phase);
this.logger.log(`[CDC] Completed phase ${i + 1}/${this.topicPhases.length}: ${phase.tableName}`);
}
this.logger.log('[CDC] All phases completed. Switching to continuous mode...');
// After all phases complete, switch to continuous consumption (all topics in parallel)
await this.startContinuousMode();
}
/**
 * Consume a single phase from the beginning until it reaches the topic's high watermark.
 */
private async consumePhaseToEnd(phase: TopicPhase): Promise<void> {
const admin = this.kafka.admin();
await admin.connect();
// Fetch the topic's high watermarks and earliest offsets
const topicOffsets = await admin.fetchTopicOffsets(phase.topic);
const highWatermarks: Map<number, string> = new Map();
const earliestOffsets: Map<number, string> = new Map();
for (const partitionOffset of topicOffsets) {
highWatermarks.set(partitionOffset.partition, partitionOffset.high);
earliestOffsets.set(partitionOffset.partition, partitionOffset.low);
}
this.logger.log(`[CDC] Phase ${phase.tableName}: High watermarks = ${JSON.stringify(Object.fromEntries(highWatermarks))}`);
// Skip if the topic is empty
const allEmpty = Array.from(highWatermarks.values()).every(hw => hw === '0');
if (allEmpty) {
this.logger.log(`[CDC] Phase ${phase.tableName}: Topic is empty, skipping`);
await admin.disconnect();
return;
}
// Use a fixed per-phase consumer group id
const phaseGroupId = `contribution-service-cdc-phase-${phase.tableName}`;
// Reset the consumer group's offsets to the earliest position.
// admin.resetOffsets is used instead of setOffsets: it is simpler and purpose-built for resetting to earliest/latest.
// This ensures every service start re-consumes from the beginning, regardless of previously committed offsets.
// Reference: https://kafka.js.org/docs/admin#a-name-reset-offsets-a-resetoffsets
this.logger.log(`[CDC] Phase ${phase.tableName}: Resetting consumer group ${phaseGroupId} offsets to earliest`);
try {
await admin.resetOffsets({
groupId: phaseGroupId,
topic: phase.topic,
earliest: true,
});
this.logger.log(`[CDC] Phase ${phase.tableName}: Consumer group offsets reset successfully`);
} catch (resetError: any) {
// If the consumer group does not exist yet, resetOffsets fails; that is expected on first run.
// fromBeginning: true covers that case.
this.logger.log(`[CDC] Phase ${phase.tableName}: Could not reset offsets (may be first run): ${resetError.message}`);
}
const phaseConsumer = this.kafka.consumer({
groupId: phaseGroupId,
});
try {
await phaseConsumer.connect();
// Subscribe to a single topic (fromBeginning takes effect for a new group)
await phaseConsumer.subscribe({
topic: phase.topic,
fromBeginning: true,
});
let processedOffsets: Map<number, bigint> = new Map();
let isComplete = false;
for (const partition of highWatermarks.keys()) {
processedOffsets.set(partition, BigInt(-1));
}
// Start consuming
await phaseConsumer.run({
eachMessage: async (payload: EachMessagePayload) => {
await this.handleMessage(payload);
// Record the processed offset
processedOffsets.set(payload.partition, BigInt(payload.message.offset));
// Check whether every partition has caught up with its high watermark
let allCaughtUp = true;
for (const [partition, highWatermark] of highWatermarks) {
const processed = processedOffsets.get(partition) ?? BigInt(-1);
// The high watermark is the next offset to be written, so the processed offset must reach highWatermark - 1
if (processed < BigInt(highWatermark) - BigInt(1)) {
allCaughtUp = false;
break;
}
}
if (allCaughtUp && !isComplete) {
isComplete = true;
this.logger.log(`[CDC] Phase ${phase.tableName}: Caught up with all partitions`);
}
},
});
// Wait until the phase catches up with the high watermark
while (!isComplete) {
await new Promise(resolve => setTimeout(resolve, 100));
// Log progress on each 100 ms poll
const currentProgress = Array.from(processedOffsets.entries())
.map(([p, o]) => `P${p}:${o}/${highWatermarks.get(p)}`)
.join(', ');
this.logger.debug(`[CDC] Phase ${phase.tableName} progress: ${currentProgress}`);
}
// Stop this phase's consumer
await phaseConsumer.stop();
await phaseConsumer.disconnect();
await admin.disconnect();
} catch (error) {
this.logger.error(`[CDC] Error in phase ${phase.tableName}`, error);
await phaseConsumer.disconnect();
await admin.disconnect();
throw error;
}
}
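The catch-up condition inside `consumePhaseToEnd` hinges on a Kafka detail: `fetchTopicOffsets` reports the high watermark as the offset of the *next* record to be written, so a partition is caught up once its last processed offset reaches `highWatermark - 1`. A self-contained sketch of that check (extracted and renamed for illustration):

```typescript
// Returns true once every partition's last processed offset has reached
// its high watermark minus one. Offsets are compared as bigint because
// Kafka offsets can exceed Number.MAX_SAFE_INTEGER.
function allCaughtUp(
  highWatermarks: Map<number, string>,
  processedOffsets: Map<number, bigint>,
): boolean {
  for (const [partition, high] of highWatermarks) {
    const processed = processedOffsets.get(partition) ?? BigInt(-1);
    if (processed < BigInt(high) - BigInt(1)) {
      return false;
    }
  }
  return true;
}

const high = new Map([[0, "10"], [1, "5"]]);
const done = new Map([[0, BigInt(9)], [1, BigInt(4)]]);
const behind = new Map([[0, BigInt(9)], [1, BigInt(3)]]);
console.log(allCaughtUp(high, done), allCaughtUp(high, behind)); // true false
```

Defaulting a missing partition to `-1` means an unconsumed, non-empty partition never counts as caught up, which matches the service's early-exit for fully empty topics.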
/**
 * Continuous mode: consume all topics in parallel.
 */
private async startContinuousMode(): Promise<void> {
this.sequentialMode = false;
this.initialSyncCompleted = true; // mark the initial sync as complete
const topics = this.topicPhases.map(p => p.topic);
await this.consumer.subscribe({
topics,
fromBeginning: false, // resume from the last committed offsets (not from the beginning)
});
this.logger.log(`[CDC] Continuous mode: Subscribed to topics: ${topics.join(', ')}`);
await this.consumer.run({
eachMessage: async (payload: EachMessagePayload) => {
await this.handleMessage(payload);
},
});
this.logger.log('[CDC] Continuous mode started - all topics being consumed in parallel');
}
/**
 * CDC sync status.
 * - initialSyncCompleted = true: the initial sequential sync has fully completed.
 */
getSyncStatus(): { isRunning: boolean; sequentialMode: boolean; allPhasesCompleted: boolean } {
return {
isRunning: this.isRunning,
sequentialMode: this.sequentialMode,
allPhasesCompleted: this.initialSyncCompleted,
};
}
/**
 *
 */


@@ -223,6 +223,117 @@ export class ContributionAccountRepository implements IContributionAccountReposi
    });
  }
async findRecentlyUpdated(since: Date, limit: number = 500): Promise<ContributionAccountAggregate[]> {
const records = await this.client.contributionAccount.findMany({
where: { updatedAt: { gte: since } },
orderBy: { updatedAt: 'desc' },
take: limit,
});
return records.map((r) => this.toDomain(r));
}
/**
 * Aggregate detailed contribution statistics.
 */
async getDetailedContributionStats(): Promise<{
// total personal contribution
personalTotal: string;
// level contribution - unlocked (already allocated to uplines)
levelUnlocked: string;
// level contribution - pending (awaiting unlock)
levelPending: string;
// level breakdown by tier
levelByTier: {
tier1: { unlocked: string; pending: string }; // levels 1-5
tier2: { unlocked: string; pending: string }; // levels 6-10
tier3: { unlocked: string; pending: string }; // levels 11-15
};
// team bonus contribution - unlocked
bonusUnlocked: string;
// team bonus contribution - pending
bonusPending: string;
// team bonus breakdown by tier
bonusByTier: {
tier1: { unlocked: string; pending: string };
tier2: { unlocked: string; pending: string };
tier3: { unlocked: string; pending: string };
};
}> {
const result = await this.client.contributionAccount.aggregate({
_sum: {
personalContribution: true,
// levels 1-5
level1Pending: true,
level2Pending: true,
level3Pending: true,
level4Pending: true,
level5Pending: true,
// levels 6-10
level6Pending: true,
level7Pending: true,
level8Pending: true,
level9Pending: true,
level10Pending: true,
// levels 11-15
level11Pending: true,
level12Pending: true,
level13Pending: true,
level14Pending: true,
level15Pending: true,
// team bonuses
bonusTier1Pending: true,
bonusTier2Pending: true,
bonusTier3Pending: true,
// totals
totalLevelPending: true,
totalBonusPending: true,
totalUnlocked: true,
},
});
const sum = result._sum;
// Levels 1-5 unlocked (the pending columns store level contribution already allocated to this user)
const level1to5 = new Decimal(sum.level1Pending || 0)
.plus(sum.level2Pending || 0)
.plus(sum.level3Pending || 0)
.plus(sum.level4Pending || 0)
.plus(sum.level5Pending || 0);
// levels 6-10
const level6to10 = new Decimal(sum.level6Pending || 0)
.plus(sum.level7Pending || 0)
.plus(sum.level8Pending || 0)
.plus(sum.level9Pending || 0)
.plus(sum.level10Pending || 0);
// levels 11-15
const level11to15 = new Decimal(sum.level11Pending || 0)
.plus(sum.level12Pending || 0)
.plus(sum.level13Pending || 0)
.plus(sum.level14Pending || 0)
.plus(sum.level15Pending || 0);
return {
personalTotal: (sum.personalContribution || new Decimal(0)).toString(),
levelUnlocked: (sum.totalLevelPending || new Decimal(0)).toString(),
levelPending: '0', // locked amounts are stored in the unallocated table
levelByTier: {
tier1: { unlocked: level1to5.toString(), pending: '0' },
tier2: { unlocked: level6to10.toString(), pending: '0' },
tier3: { unlocked: level11to15.toString(), pending: '0' },
},
bonusUnlocked: (sum.totalBonusPending || new Decimal(0)).toString(),
bonusPending: '0', // locked amounts are stored in the unallocated table
bonusByTier: {
tier1: { unlocked: (sum.bonusTier1Pending || new Decimal(0)).toString(), pending: '0' },
tier2: { unlocked: (sum.bonusTier2Pending || new Decimal(0)).toString(), pending: '0' },
tier3: { unlocked: (sum.bonusTier3Pending || new Decimal(0)).toString(), pending: '0' },
},
};
}
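Every field read from `result._sum` above is guarded with `|| 0` before summing. That guard exists because Prisma's `aggregate` returns `null` for a `_sum` field when no rows match. A minimal illustration of the pattern with plain numbers standing in for Prisma's `Decimal` (types and names here are illustrative):

```typescript
// Prisma-style aggregate result: each summed field may be null (no rows).
type Sums = { level1Pending: number | null; level2Pending: number | null };

// Null-guarded tier sum, mirroring `new Decimal(sum.level1Pending || 0).plus(...)`.
function sumTier(sum: Sums): number {
  return (sum.level1Pending || 0) + (sum.level2Pending || 0);
}

console.log(sumTier({ level1Pending: 5, level2Pending: null })); // 5
console.log(sumTier({ level1Pending: null, level2Pending: null })); // 0
```

Without the guard, the first `new Decimal(null)` on an empty table would throw instead of yielding the zero totals the stats endpoint needs.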
private toDomain(record: any): ContributionAccountAggregate {
return ContributionAccountAggregate.fromPersistence({
id: record.id,


@@ -136,7 +136,10 @@ export class SyncedDataRepository implements ISyncedDataRepository {
   async findUndistributedAdoptions(limit: number = 100): Promise<SyncedAdoption[]> {
     const records = await this.client.syncedAdoption.findMany({
-      where: { contributionDistributed: false },
+      where: {
+        contributionDistributed: false,
+        status: 'MINING_ENABLED', // only process adoption orders that reached final success
+      },
       orderBy: { adoptionDate: 'asc' },
       take: limit,
     });
@@ -171,7 +174,10 @@ export class SyncedDataRepository implements ISyncedDataRepository {
   async getTotalTreesByAccountSequence(accountSequence: string): Promise<number> {
     const result = await this.client.syncedAdoption.aggregate({
-      where: { accountSequence },
+      where: {
+        accountSequence,
+        status: 'MINING_ENABLED', // only count adoption orders that reached final success
+      },
       _sum: { treeCount: true },
     });
     return result._sum.treeCount ?? 0;
@@ -285,8 +291,12 @@
     const accountSequences = directReferrals.map((r) => r.accountSequence);
 
+    // Only count direct referrals that have at least one MINING_ENABLED adoption
     const adoptedCount = await this.client.syncedAdoption.findMany({
-      where: { accountSequence: { in: accountSequences } },
+      where: {
+        accountSequence: { in: accountSequences },
+        status: 'MINING_ENABLED', // only count adoption orders that reached final success
+      },
       distinct: ['accountSequence'],
     });
@@ -308,7 +318,10 @@
     const adoptions = await this.client.syncedAdoption.groupBy({
       by: ['accountSequence'],
-      where: { accountSequence: { in: sequences } },
+      where: {
+        accountSequence: { in: sequences },
+        status: 'MINING_ENABLED', // only count adoption orders that reached final success
+      },
       _sum: { treeCount: true },
     });
@@ -346,6 +359,89 @@
    return result;
  }
// ========== Planting ledger queries ==========
async getPlantingLedger(
accountSequence: string,
page: number = 1,
pageSize: number = 20,
): Promise<{
items: SyncedAdoption[];
total: number;
page: number;
pageSize: number;
totalPages: number;
}> {
const skip = (page - 1) * pageSize;
// Only return adoption records in MINING_ENABLED status
const whereClause = { accountSequence, status: 'MINING_ENABLED' };
const [items, total] = await Promise.all([
this.client.syncedAdoption.findMany({
where: whereClause,
orderBy: { adoptionDate: 'desc' },
skip,
take: pageSize,
}),
this.client.syncedAdoption.count({
where: whereClause,
}),
]);
return {
items: items.map((r) => this.toSyncedAdoption(r)),
total,
page,
pageSize,
totalPages: Math.ceil(total / pageSize),
};
}
async getPlantingSummary(accountSequence: string): Promise<{
totalOrders: number;
totalTreeCount: number;
totalAmount: string;
effectiveTreeCount: number;
firstPlantingAt: Date | null;
lastPlantingAt: Date | null;
}> {
// Only count adoption records in MINING_ENABLED status
const adoptions = await this.client.syncedAdoption.findMany({
where: { accountSequence, status: 'MINING_ENABLED' },
orderBy: { adoptionDate: 'asc' },
});
if (adoptions.length === 0) {
return {
totalOrders: 0,
totalTreeCount: 0,
totalAmount: '0',
effectiveTreeCount: 0,
firstPlantingAt: null,
lastPlantingAt: null,
};
}
const totalOrders = adoptions.length;
const totalTreeCount = adoptions.reduce((sum, a) => sum + a.treeCount, 0);
// Total amount = treeCount * contributionPerTree
let totalAmount = new Decimal(0);
for (const adoption of adoptions) {
const amount = new Decimal(adoption.contributionPerTree).mul(adoption.treeCount);
totalAmount = totalAmount.add(amount);
}
return {
totalOrders,
totalTreeCount,
totalAmount: totalAmount.toString(),
effectiveTreeCount: totalTreeCount, // all records here are effective (MINING_ENABLED only)
firstPlantingAt: adoptions[0]?.adoptionDate || null,
lastPlantingAt: adoptions[adoptions.length - 1]?.adoptionDate || null,
};
}
// ========== Statistics (for the query service) ==========
async countUsers(): Promise<number> {
@@ -358,10 +454,23 @@ export class SyncedDataRepository implements ISyncedDataRepository {
   async countUndistributedAdoptions(): Promise<number> {
     return this.client.syncedAdoption.count({
-      where: { contributionDistributed: false },
+      where: {
+        contributionDistributed: false,
+        status: 'MINING_ENABLED', // only count adoption orders that reached final success
+      },
     });
   }
async getTotalTrees(): Promise<number> {
const result = await this.client.syncedAdoption.aggregate({
where: {
status: 'MINING_ENABLED', // only count adoption orders that reached final success
},
_sum: { treeCount: true },
});
return result._sum.treeCount ?? 0;
}
// ========== Private helpers ==========
private toSyncedUser(record: any): SyncedUser {


@@ -7,6 +7,7 @@ export type SystemAccountType = 'OPERATION' | 'PROVINCE' | 'CITY' | 'HEADQUARTER
 export interface SystemAccount {
   id: bigint;
   accountType: SystemAccountType;
+  regionCode: string | null; // province/city code
   name: string;
   contributionBalance: ContributionAmount;
   contributionNeverExpires: boolean;
@@ -20,6 +21,8 @@ export interface SystemContributionRecord {
   systemAccountId: bigint;
   sourceAdoptionId: bigint;
   sourceAccountSequence: string;
+  sourceType: string; // source type: FIXED_RATE / LEVEL_OVERFLOW / LEVEL_NO_ANCESTOR / BONUS_TIER_1/2/3
+  levelDepth: number | null; // level depth (1-15), only set for level-related source types
   distributionRate: number;
   amount: ContributionAmount;
   effectiveDate: Date;
@@ -36,9 +39,19 @@ export class SystemAccountRepository {
     return this.unitOfWork.getClient();
   }
 
-  async findByType(accountType: SystemAccountType): Promise<SystemAccount | null> {
-    const record = await this.client.systemAccount.findUnique({
-      where: { accountType },
+  /**
+   * Look up a system account by accountType + regionCode.
+   * regionCode is nullable, so this uses findFirst instead of findUnique.
+   */
+  async findByTypeAndRegion(
+    accountType: SystemAccountType,
+    regionCode: string | null,
+  ): Promise<SystemAccount | null> {
+    const record = await this.client.systemAccount.findFirst({
+      where: {
+        accountType,
+        regionCode: regionCode === null ? { equals: null } : regionCode,
+      },
     });
 
     if (!record) {
@@ -48,123 +61,225 @@ export class SystemAccountRepository {
     return this.toSystemAccount(record);
   }
 
-  async findAll(): Promise<SystemAccount[]> {
+  /**
+   * Query all accounts of a given type (e.g. all CITY accounts).
+   */
+  async findByType(accountType: SystemAccountType): Promise<SystemAccount[]> {
     const records = await this.client.systemAccount.findMany({
-      orderBy: { accountType: 'asc' },
+      where: { accountType },
+      orderBy: { regionCode: 'asc' },
     });
     return records.map((r) => this.toSystemAccount(r));
   }
 
+  async findAll(): Promise<SystemAccount[]> {
+    const records = await this.client.systemAccount.findMany({
+      orderBy: [{ accountType: 'asc' }, { regionCode: 'asc' }],
+    });
+    return records.map((r) => this.toSystemAccount(r));
+  }
+
+  /**
+   * Ensure the base system accounts exist.
+   */
   async ensureSystemAccountsExist(): Promise<void> {
     const accounts: { accountType: SystemAccountType; name: string }[] = [
       { accountType: 'OPERATION', name: '运营账户' },
-      { accountType: 'PROVINCE', name: '省公司账户' },
-      { accountType: 'CITY', name: '市公司账户' },
       { accountType: 'HEADQUARTERS', name: '总部账户' },
     ];
 
     for (const account of accounts) {
-      await this.client.systemAccount.upsert({
-        where: { accountType: account.accountType },
-        create: {
-          accountType: account.accountType,
-          name: account.name,
-          contributionBalance: 0,
+      // regionCode is nullable, so use findFirst + create instead of upsert
+      const existing = await this.client.systemAccount.findFirst({
+        where: {
+          accountType: account.accountType,
+          regionCode: { equals: null },
         },
-        update: {},
       });
+      if (!existing) {
+        await this.client.systemAccount.create({
+          data: {
+            accountType: account.accountType,
+            regionCode: null,
+            name: account.name,
+            contributionBalance: 0,
+            contributionNeverExpires: true,
+          },
+        });
+      }
     }
   }
 
+  /**
+   * Add contribution to a system account (creating the account on first use).
+   */
   async addContribution(
     accountType: SystemAccountType,
+    regionCode: string | null,
     amount: ContributionAmount,
   ): Promise<void> {
-    await this.client.systemAccount.update({
-      where: { accountType },
-      data: {
-        contributionBalance: { increment: amount.value },
+    const name = this.getAccountName(accountType, regionCode);
+
+    // regionCode is nullable, so use findFirst + create/update instead of upsert
+    const existing = await this.client.systemAccount.findFirst({
+      where: {
+        accountType,
+        regionCode: regionCode === null ? { equals: null } : regionCode,
       },
     });
+
+    if (existing) {
+      await this.client.systemAccount.update({
+        where: { id: existing.id },
+        data: {
+          contributionBalance: { increment: amount.value },
+        },
+      });
+    } else {
+      await this.client.systemAccount.create({
+        data: {
+          accountType,
+          regionCode,
+          name,
+          contributionBalance: amount.value,
+          contributionNeverExpires: true,
+        },
+      });
+    }
+  }
+
+  /**
+   * Default display name for a system account.
+   */
+  private getAccountName(accountType: SystemAccountType, regionCode: string | null): string {
+    if (!regionCode) {
+      const names: Record<SystemAccountType, string> = {
+        OPERATION: '运营账户',
+        PROVINCE: '省公司账户',
+        CITY: '市公司账户',
+        HEADQUARTERS: '总部账户',
+      };
+      return names[accountType] || accountType;
+    }
+    return `${regionCode}账户`;
+  }
+
+  /**
+   * Subtract contribution from a system account.
+   */
+  async subtractContribution(
+    accountType: SystemAccountType,
+    regionCode: string | null,
+    amount: ContributionAmount,
+  ): Promise<void> {
+    const existing = await this.client.systemAccount.findFirst({
+      where: {
+        accountType,
+        regionCode: regionCode === null ? { equals: null } : regionCode,
+      },
+    });
+
+    if (existing) {
+      await this.client.systemAccount.update({
+        where: { id: existing.id },
+        data: {
+          contributionBalance: { decrement: amount.value },
+        },
+      });
+    }
+  }
+
+  /**
+   * Soft-delete the contribution records produced by one adoption.
+   */
+  async deleteContributionRecordsByAdoption(
+    accountType: SystemAccountType,
+    regionCode: string | null,
+    sourceAdoptionId: bigint,
+    sourceAccountSequence: string,
+  ): Promise<number> {
+    const systemAccount = await this.findByTypeAndRegion(accountType, regionCode);
+    if (!systemAccount) {
+      return 0;
+    }
+
+    const result = await this.client.systemContributionRecord.updateMany({
+      where: {
+        systemAccountId: systemAccount.id,
+        sourceAdoptionId,
+        sourceAccountSequence,
+        deletedAt: null, // only soft-delete records that are not already deleted
+      },
+      data: {
+        deletedAt: new Date(),
+      },
+    });
+    return result.count;
   }
 
   async saveContributionRecord(record: {
-    systemAccountType: SystemAccountType;
+    accountType: SystemAccountType;
+    regionCode: string | null;
     sourceAdoptionId: bigint;
     sourceAccountSequence: string;
+    sourceType: string; // source type
+    levelDepth?: number | null; // level depth
     distributionRate: number;
     amount: ContributionAmount;
     effectiveDate: Date;
     expireDate?: Date | null;
-  }): Promise<void> {
-    const systemAccount = await this.findByType(record.systemAccountType);
+  }): Promise<SystemContributionRecord> {
+    const systemAccount = await this.findByTypeAndRegion(record.accountType, record.regionCode);
     if (!systemAccount) {
-      throw new Error(`System account ${record.systemAccountType} not found`);
+      throw new Error(`System account ${record.accountType}:${record.regionCode} not found`);
     }
 
-    await this.client.systemContributionRecord.create({
+    const created = await this.client.systemContributionRecord.create({
       data: {
         systemAccountId: systemAccount.id,
         sourceAdoptionId: record.sourceAdoptionId,
         sourceAccountSequence: record.sourceAccountSequence,
+        sourceType: record.sourceType,
+        levelDepth: record.levelDepth ?? null,
         distributionRate: record.distributionRate,
         amount: record.amount.value,
         effectiveDate: record.effectiveDate,
         expireDate: record.expireDate ?? null,
       },
     });
-  }
 
-  async saveContributionRecords(records: {
-    systemAccountType: SystemAccountType;
-    sourceAdoptionId: bigint;
-    sourceAccountSequence: string;
-    distributionRate: number;
-    amount: ContributionAmount;
-    effectiveDate: Date;
-    expireDate?: Date | null;
-  }[]): Promise<void> {
-    if (records.length === 0) return;
-    const systemAccounts = await this.findAll();
-    const accountMap = new Map<SystemAccountType, bigint>();
-    for (const account of systemAccounts) {
-      accountMap.set(account.accountType, account.id);
-    }
-    await this.client.systemContributionRecord.createMany({
-      data: records.map((r) => ({
-        systemAccountId: accountMap.get(r.systemAccountType)!,
-        sourceAdoptionId: r.sourceAdoptionId,
-        sourceAccountSequence: r.sourceAccountSequence,
-        distributionRate: r.distributionRate,
-        amount: r.amount.value,
-        effectiveDate: r.effectiveDate,
-        expireDate: r.expireDate ?? null,
-      })),
-    });
+    return this.toContributionRecord(created);
   }
 
   async findContributionRecords(
-    systemAccountType: SystemAccountType,
+    accountType: SystemAccountType,
+    regionCode: string | null,
     page: number,
     pageSize: number,
   ): Promise<{ data: SystemContributionRecord[]; total: number }> {
-    const systemAccount = await this.findByType(systemAccountType);
+    const systemAccount = await this.findByTypeAndRegion(accountType, regionCode);
     if (!systemAccount) {
       return { data: [], total: 0 };
     }
 
+    const whereClause = {
+      systemAccountId: systemAccount.id,
+      deletedAt: null, // filter out soft-deleted records
+    };
+
     const [records, total] = await Promise.all([
       this.client.systemContributionRecord.findMany({
-        where: { systemAccountId: systemAccount.id },
+        where: whereClause,
         skip: (page - 1) * pageSize,
         take: pageSize,
         orderBy: { createdAt: 'desc' },
       }),
       this.client.systemContributionRecord.count({
-        where: { systemAccountId: systemAccount.id },
+        where: whereClause,
       }),
     ]);
async findContributionRecords( async findContributionRecords(
systemAccountType: SystemAccountType, accountType: SystemAccountType,
regionCode: string | null,
page: number, page: number,
pageSize: number, pageSize: number,
): Promise<{ data: SystemContributionRecord[]; total: number }> { ): Promise<{ data: SystemContributionRecord[]; total: number }> {
const systemAccount = await this.findByType(systemAccountType); const systemAccount = await this.findByTypeAndRegion(accountType, regionCode);
if (!systemAccount) { if (!systemAccount) {
return { data: [], total: 0 }; return { data: [], total: 0 };
} }
const whereClause = {
systemAccountId: systemAccount.id,
deletedAt: null, // 过滤已软删除的记录
};
const [records, total] = await Promise.all([
this.client.systemContributionRecord.findMany({
where: whereClause,
skip: (page - 1) * pageSize,
take: pageSize,
orderBy: { createdAt: 'desc' },
}),
this.client.systemContributionRecord.count({
where: whereClause,
}),
]);

@@ -178,6 +293,7 @@ export class SystemAccountRepository {
return {
id: record.id,
accountType: record.accountType as SystemAccountType,
regionCode: record.regionCode,
name: record.name,
contributionBalance: new ContributionAmount(record.contributionBalance),
contributionNeverExpires: record.contributionNeverExpires,

@@ -193,6 +309,8 @@ export class SystemAccountRepository {
systemAccountId: record.systemAccountId,
sourceAdoptionId: record.sourceAdoptionId,
sourceAccountSequence: record.sourceAccountSequence,
sourceType: record.sourceType,
levelDepth: record.levelDepth,
distributionRate: record.distributionRate,
amount: new ContributionAmount(record.amount),
effectiveDate: record.effectiveDate,


@@ -7,14 +7,16 @@ export interface UnallocatedContribution {
unallocType: string;
wouldBeAccountSequence: string | null;
levelDepth: number | null;
bonusTier: number | null;
amount: ContributionAmount;
reason: string | null;
sourceAdoptionId: bigint;
sourceAccountSequence: string;
effectiveDate: Date;
expireDate: Date;
status: string;
allocatedAt: Date | null;
allocatedToAccountSequence: string | null;
createdAt: Date;
}
@@ -130,20 +132,157 @@ export class UnallocatedContributionRepository {
};
}
/**
* Find pending bonus records addressed to an account at a given bonus tier.
* @param accountSequence target account sequence
* @param bonusTier bonus tier (2 or 3)
*/
async findPendingBonusByAccountSequence(
accountSequence: string,
bonusTier: number,
): Promise<UnallocatedContribution[]> {
const records = await this.client.unallocatedContribution.findMany({
where: {
wouldBeAccountSequence: accountSequence,
unallocType: `BONUS_TIER_${bonusTier}`,
status: 'PENDING',
},
orderBy: { createdAt: 'asc' },
});
return records.map((r) => this.toDomain(r));
}
/**
* Claim pending bonus records - mark them as allocated to the account.
* @param ids record ID list
* @param accountSequence account sequence receiving the bonus
*/
async claimBonusRecords(ids: bigint[], accountSequence: string): Promise<void> {
if (ids.length === 0) return;
await this.client.unallocatedContribution.updateMany({
where: {
id: { in: ids },
status: 'PENDING',
},
data: {
status: 'ALLOCATED_TO_USER',
allocatedAt: new Date(),
allocatedToAccountSequence: accountSequence,
},
});
}
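The two methods above form a find-then-claim flow: pending `BONUS_TIER_*` records addressed to an account are looked up, then flipped to `ALLOCATED_TO_USER` with the claimer recorded. A minimal in-memory sketch of that status transition (plain objects standing in for the Prisma rows; all names here are illustrative, not the service's API):

```typescript
type BonusRecord = {
  id: bigint;
  wouldBeAccountSequence: string;
  unallocType: string;
  status: "PENDING" | "ALLOCATED_TO_USER";
  allocatedAt: Date | null;
  allocatedToAccountSequence: string | null;
};

// Mirrors findPendingBonusByAccountSequence over an in-memory array.
function findPendingBonus(rows: BonusRecord[], account: string, tier: number): BonusRecord[] {
  return rows.filter(
    (r) =>
      r.wouldBeAccountSequence === account &&
      r.unallocType === `BONUS_TIER_${tier}` &&
      r.status === "PENDING",
  );
}

// Mirrors claimBonusRecords: only PENDING rows whose id is in `ids` are updated.
function claimBonus(rows: BonusRecord[], ids: bigint[], account: string): void {
  const idSet = new Set(ids.map((i) => i.toString()));
  for (const r of rows) {
    if (idSet.has(r.id.toString()) && r.status === "PENDING") {
      r.status = "ALLOCATED_TO_USER";
      r.allocatedAt = new Date();
      r.allocatedToAccountSequence = account;
    }
  }
}

const rows: BonusRecord[] = [
  { id: 1n, wouldBeAccountSequence: "A001", unallocType: "BONUS_TIER_2", status: "PENDING", allocatedAt: null, allocatedToAccountSequence: null },
  { id: 2n, wouldBeAccountSequence: "A001", unallocType: "BONUS_TIER_3", status: "PENDING", allocatedAt: null, allocatedToAccountSequence: null },
];
const pending = findPendingBonus(rows, "A001", 2);
claimBonus(rows, pending.map((r) => r.id), "A001");
```

In the real repository the `status: 'PENDING'` condition inside `updateMany` makes the claim idempotent against concurrent callers; the sketch preserves that guard.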
/**
* Find all pending bonus records (any tier) addressed to an account.
*/
async findAllPendingBonusByAccountSequence(
accountSequence: string,
): Promise<UnallocatedContribution[]> {
const records = await this.client.unallocatedContribution.findMany({
where: {
wouldBeAccountSequence: accountSequence,
unallocType: { startsWith: 'BONUS_TIER_' },
status: 'PENDING',
},
orderBy: { createdAt: 'asc' },
});
return records.map((r) => this.toDomain(r));
}
/**
* Sum pending unallocated contribution grouped by referral level tier.
*/
async getUnallocatedByLevelTier(): Promise<{
tier1: string; // levels 1-5 unallocated
tier2: string; // levels 6-10 unallocated
tier3: string; // levels 11-15 unallocated
}> {
const results = await this.client.unallocatedContribution.groupBy({
by: ['levelDepth'],
where: {
levelDepth: { not: null },
status: 'PENDING',
},
_sum: { amount: true },
});
let tier1 = new ContributionAmount(0);
let tier2 = new ContributionAmount(0);
let tier3 = new ContributionAmount(0);
for (const item of results) {
const depth = item.levelDepth!;
const amount = new ContributionAmount(item._sum.amount || 0);
if (depth >= 1 && depth <= 5) {
tier1 = tier1.add(amount);
} else if (depth >= 6 && depth <= 10) {
tier2 = tier2.add(amount);
} else if (depth >= 11 && depth <= 15) {
tier3 = tier3.add(amount);
}
}
return {
tier1: tier1.value.toString(),
tier2: tier2.value.toString(),
tier3: tier3.value.toString(),
};
}
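`getUnallocatedByLevelTier` buckets the grouped sums by depth range (1-5, 6-10, 11-15); depths outside those ranges fall through the `if/else` chain and are dropped. The bucketing can be isolated as a pure function (a sketch using plain `bigint` in place of `ContributionAmount`; not the repository's code):

```typescript
// Each group row carries a levelDepth and the summed amount for that depth,
// as returned by Prisma's groupBy with _sum.
function bucketByLevelTier(
  groups: { levelDepth: number; sum: bigint }[],
): { tier1: string; tier2: string; tier3: string } {
  let tier1 = 0n;
  let tier2 = 0n;
  let tier3 = 0n;
  for (const g of groups) {
    if (g.levelDepth >= 1 && g.levelDepth <= 5) tier1 += g.sum;
    else if (g.levelDepth >= 6 && g.levelDepth <= 10) tier2 += g.sum;
    else if (g.levelDepth >= 11 && g.levelDepth <= 15) tier3 += g.sum;
    // depths outside 1-15 are ignored, matching the repository's chain
  }
  return { tier1: tier1.toString(), tier2: tier2.toString(), tier3: tier3.toString() };
}

const tiers = bucketByLevelTier([
  { levelDepth: 1, sum: 100n },
  { levelDepth: 5, sum: 50n },
  { levelDepth: 7, sum: 30n },
  { levelDepth: 15, sum: 20n },
]);
```

Returning the totals as strings (rather than `bigint`) keeps the values JSON-serializable, which is presumably why the repository method does the same.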
/**
* Sum pending unallocated contribution grouped by bonus tier.
*/
async getUnallocatedByBonusTier(): Promise<{
tier1: string;
tier2: string;
tier3: string;
}> {
const results = await this.client.unallocatedContribution.groupBy({
by: ['unallocType'],
where: {
unallocType: { startsWith: 'BONUS_TIER_' },
status: 'PENDING',
},
_sum: { amount: true },
});
let tier1 = '0';
let tier2 = '0';
let tier3 = '0';
for (const item of results) {
const amount = (item._sum.amount || 0).toString();
if (item.unallocType === 'BONUS_TIER_1') {
tier1 = amount;
} else if (item.unallocType === 'BONUS_TIER_2') {
tier2 = amount;
} else if (item.unallocType === 'BONUS_TIER_3') {
tier3 = amount;
}
}
return { tier1, tier2, tier3 };
}
private toDomain(record: any): UnallocatedContribution {
return {
id: record.id,
unallocType: record.unallocType,
wouldBeAccountSequence: record.wouldBeAccountSequence,
levelDepth: record.levelDepth,
bonusTier: record.bonusTier,
amount: new ContributionAmount(record.amount),
reason: record.reason,
sourceAdoptionId: record.sourceAdoptionId,
sourceAccountSequence: record.sourceAccountSequence,
effectiveDate: record.effectiveDate,
expireDate: record.expireDate,
status: record.status,
allocatedAt: record.allocatedAt,
allocatedToAccountSequence: record.allocatedToAccountSequence,
createdAt: record.createdAt,
};
}


@@ -25,6 +25,7 @@
# CDC & Sync:
# ./deploy-mining.sh sync-reset # Reset CDC consumer offsets to beginning
# ./deploy-mining.sh sync-status # Show CDC consumer group status
# ./deploy-mining.sh cdc-resnapshot # Force Debezium to re-snapshot (use when Kafka data lost)
#
# Full Reset (for development/testing):
# ./deploy-mining.sh full-reset # Complete reset: stop services, drop DBs, recreate, resync

@@ -103,8 +104,13 @@ declare -A SERVICE_PORTS=(
)
# CDC Consumer Groups (all groups that need to be reset during full-reset)
# NOTE: contribution-service uses sequential phase consumption with separate consumer groups
# for each table (user_accounts, referral_relationships, planting_orders)
CDC_CONSUMER_GROUPS=(
"contribution-service-cdc-group"
"contribution-service-cdc-phase-user_accounts"
"contribution-service-cdc-phase-referral_relationships"
"contribution-service-cdc-phase-planting_orders"
"auth-service-cdc-group"
"mining-admin-service-cdc-group"
)

@@ -119,6 +125,14 @@ OUTBOX_CONNECTORS=(
"mining-wallet-outbox-connector"
)
# Debezium CDC Postgres Connectors (for 1.0 -> 2.0 data sync)
# These connectors capture changes from 1.0 service databases
CDC_POSTGRES_CONNECTORS=(
"identity-postgres-connector"
"referral-postgres-connector"
"planting-postgres-connector"
)
# Debezium Connect URL (default port 8084 as mapped in docker-compose)
DEBEZIUM_CONNECT_URL="${DEBEZIUM_CONNECT_URL:-http://localhost:8084}"

@@ -708,6 +722,148 @@ sync_reset() {
log_info "Run: ./deploy-mining.sh up contribution-service && ./deploy-mining.sh up auth-service"
}
# Trigger Debezium CDC connectors to re-snapshot
# This is needed when Kafka topic messages are deleted (due to retention or manual cleanup)
# and the connector needs to re-export all data from the source database
cdc_resnapshot() {
print_section "Triggering CDC Connectors Re-Snapshot"
local connect_url="$DEBEZIUM_CONNECT_URL"
# Check if Debezium Connect is available
if ! curl -s "$connect_url" &>/dev/null; then
log_error "Debezium Connect not available at $connect_url"
return 1
fi
echo -e "${YELLOW}WARNING: This will delete and recreate CDC Postgres connectors.${NC}"
echo -e "${YELLOW}All connectors will re-snapshot their source tables.${NC}"
echo ""
echo "Connectors to be re-created:"
for connector in "${CDC_POSTGRES_CONNECTORS[@]}"; do
echo " - $connector"
done
echo ""
read -p "Continue? (y/n): " confirm
if [ "$confirm" != "y" ]; then
log_warn "Aborted"
return 1
fi
# Stop CDC consumer services first
log_step "Stopping CDC consumer services..."
service_stop "contribution-service"
# Wait for consumer groups to become inactive
log_info "Waiting 10 seconds for consumers to disconnect..."
sleep 10
# Delete consumer groups to ensure fresh consumption
log_step "Deleting consumer groups..."
for group in "${CDC_CONSUMER_GROUPS[@]}"; do
log_info "Deleting consumer group: $group"
if docker ps --format '{{.Names}}' 2>/dev/null | grep -q "^${KAFKA_CONTAINER}$"; then
docker exec "$KAFKA_CONTAINER" kafka-consumer-groups --bootstrap-server localhost:9092 \
--delete --group "$group" 2>/dev/null && log_success "Deleted $group" || log_warn "Could not delete $group"
fi
done
# Clear processed_cdc_events table
log_step "Clearing processed CDC events..."
if run_psql "rwa_contribution" "TRUNCATE TABLE processed_cdc_events;" 2>/dev/null; then
log_success "Truncated processed_cdc_events in rwa_contribution"
else
log_warn "Could not truncate processed_cdc_events (table may not exist)"
fi
# For each CDC Postgres connector, save config, delete, and recreate
log_step "Re-creating CDC Postgres connectors..."
local scripts_dir="$SCRIPT_DIR/scripts/debezium"
for connector in "${CDC_POSTGRES_CONNECTORS[@]}"; do
log_info "Processing connector: $connector"
# Get current config from running connector
local config
config=$(curl -s "$connect_url/connectors/$connector/config" 2>/dev/null)
local config_file=""
local use_file_config=false
# If connector doesn't exist, try to find config file
if [ -z "$config" ] || echo "$config" | grep -q "error_code"; then
log_warn "Connector $connector not found, looking for config file..."
# Map connector name to config file
case "$connector" in
"identity-postgres-connector")
config_file="$scripts_dir/identity-connector.json"
;;
"referral-postgres-connector")
config_file="$scripts_dir/referral-connector.json"
;;
"planting-postgres-connector")
config_file="$scripts_dir/planting-connector.json"
;;
esac
if [ -n "$config_file" ] && [ -f "$config_file" ]; then
log_info "Found config file: $config_file"
use_file_config=true
else
log_error "No config available for $connector, skipping"
continue
fi
else
# Delete existing connector
log_info "Deleting connector: $connector"
curl -s -X DELETE "$connect_url/connectors/$connector" &>/dev/null
sleep 2
fi
# Create connector
log_info "Creating connector: $connector with snapshot.mode=always"
local result
if [ "$use_file_config" = true ]; then
# Use config file, replace snapshot.mode with always
local json_config
json_config=$(cat "$config_file" | envsubst | sed 's/"snapshot.mode": "initial"/"snapshot.mode": "always"/')
result=$(echo "$json_config" | curl -s -X POST "$connect_url/connectors" \
-H "Content-Type: application/json" \
-d @- 2>/dev/null)
else
# Use config from running connector, but change snapshot.mode to always
local modified_config
modified_config=$(echo "$config" | sed 's/"snapshot.mode":"initial"/"snapshot.mode":"always"/' | sed 's/"snapshot.mode": "initial"/"snapshot.mode": "always"/')
result=$(curl -s -X POST "$connect_url/connectors" \
-H "Content-Type: application/json" \
-d "{\"name\":\"$connector\",\"config\":$modified_config}" 2>/dev/null)
fi
if echo "$result" | grep -q '"name"'; then
log_success "Created connector: $connector"
else
log_error "Failed to create connector $connector: $result"
fi
# Wait between connectors
sleep 3
done
# Wait for snapshots to complete
log_step "Waiting 30 seconds for Debezium snapshots to complete..."
sleep 30
# Start services
log_step "Starting CDC consumer services..."
service_start "contribution-service"
log_success "CDC re-snapshot completed!"
log_info "Monitor sync progress with: ./deploy-mining.sh sync-status"
}
sync_status() {
print_section "CDC Sync Status"
@@ -1102,9 +1258,47 @@ full_reset() {
service_start "$service"
done

log_step "Step 10/18: Waiting for contribution-service CDC sync to complete..."
log_info "Waiting for contribution-service to complete CDC sync (user_accounts -> referral_relationships -> planting_orders)..."
# Wait for contribution-service's sequential CDC sync to complete,
# checking sync state via the /health/cdc-sync API
local max_wait=600 # wait at most 10 minutes
local wait_count=0
local sync_completed=false
local cdc_sync_url="http://localhost:3020/api/v2/health/cdc-sync"
while [ "$wait_count" -lt "$max_wait" ] && [ "$sync_completed" = false ]; do
# Query the API for the current sync status
local sync_status
sync_status=$(curl -s "$cdc_sync_url" 2>/dev/null || echo '{}')
if echo "$sync_status" | grep -q '"allPhasesCompleted":true'; then
sync_completed=true
log_success "CDC sync completed - all phases finished"
else
# Show current progress
local is_running
local sequential_mode
is_running=$(echo "$sync_status" | grep -o '"isRunning":[^,}]*' | cut -d':' -f2)
sequential_mode=$(echo "$sync_status" | grep -o '"sequentialMode":[^,}]*' | cut -d':' -f2)
if [ "$is_running" = "true" ] && [ "$sequential_mode" = "true" ]; then
log_info "CDC sync in progress (sequential mode)... (waited ${wait_count}s)"
elif [ "$is_running" = "true" ]; then
log_info "CDC consumer running... (waited ${wait_count}s)"
else
log_info "Waiting for CDC consumer to start... (waited ${wait_count}s)"
fi
sleep 5
wait_count=$((wait_count + 5))
fi
done
if [ "$sync_completed" = false ]; then
log_warn "CDC sync did not complete within ${max_wait}s, proceeding anyway..."
log_info "You may need to wait longer or check: curl $cdc_sync_url"
fi
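The polling loop above extracts fields from the `/health/cdc-sync` JSON with `grep`/`cut`; the same completion check is easier to reason about as a small parser. A sketch in TypeScript, with the response shape inferred only from the fields the script greps for (`allPhasesCompleted`, `isRunning`, `sequentialMode`) and hypothetical beyond that:

```typescript
type CdcSyncHealth = {
  allPhasesCompleted?: boolean;
  isRunning?: boolean;
  sequentialMode?: boolean;
};

// Parse a /health/cdc-sync response body and decide whether full_reset may proceed.
function isCdcSyncComplete(body: string): boolean {
  try {
    const health: CdcSyncHealth = JSON.parse(body);
    return health.allPhasesCompleted === true;
  } catch {
    // An unreachable service or malformed body counts as "not yet complete",
    // mirroring the script's `|| echo '{}'` fallback.
    return false;
  }
}

const done = isCdcSyncComplete('{"isRunning":true,"sequentialMode":true,"allPhasesCompleted":true}');
const inProgress = isCdcSyncComplete('{"isRunning":true,"sequentialMode":true,"allPhasesCompleted":false}');
const unreachable = isCdcSyncComplete("");
```

The shell version's `grep -q '"allPhasesCompleted":true'` is equivalent only as long as the JSON is emitted without spaces after colons, which is one reason a real JSON parse is the safer check.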
log_step "Step 11/18: Registering Debezium outbox connectors..."
# Register outbox connectors AFTER services are running and have synced data

@@ -1327,6 +1521,7 @@ show_help() {
echo -e "${BOLD}CDC / Sync Management:${NC}"
echo " sync-reset Reset CDC consumer to read from beginning"
echo " sync-status Show CDC consumer group status"
echo " cdc-resnapshot Force Debezium CDC connectors to re-snapshot ${YELLOW}(use when Kafka data lost)${NC}"
echo " outbox-register Register all Debezium outbox connectors"
echo " outbox-status Show outbox connector status"
echo " outbox-delete Delete all outbox connectors"

@@ -1429,6 +1624,10 @@ main() {
sync-status)
sync_status
;;
cdc-resnapshot)
print_header
cdc_resnapshot
;;
# Outbox connector commands
outbox-register)


@@ -76,6 +76,10 @@ services:
REDIS_DB: 11
# Kafka
KAFKA_BROKERS: kafka:29092
# JWT config (shares secret with auth-service to verify tokens)
JWT_SECRET: ${JWT_SECRET:-your-jwt-secret-change-in-production}
# 2.0 internal service calls
CONTRIBUTION_SERVICE_URL: http://contribution-service:3020
ports:
- "3021:3021"
healthcheck:

@@ -108,6 +112,8 @@ services:
KAFKA_BROKERS: kafka:29092
# 2.0 internal service calls
MINING_SERVICE_URL: http://mining-service:3021
# JWT config (shares secret with auth-service to verify tokens)
JWT_SECRET: ${JWT_SECRET:-your-jwt-secret-change-in-production}
ports:
- "3022:3022"
healthcheck:


@@ -8,12 +8,14 @@
"name": "mining-admin-service",
"version": "1.0.0",
"dependencies": {
"@nestjs/axios": "^3.1.3",
"@nestjs/common": "^10.3.0",
"@nestjs/config": "^3.1.1",
"@nestjs/core": "^10.3.0",
"@nestjs/platform-express": "^10.3.0",
"@nestjs/swagger": "^7.1.17",
"@prisma/client": "^5.7.1",
"axios": "^1.13.2",
"bcrypt": "^5.1.1",
"class-transformer": "^0.5.1",
"class-validator": "^0.14.0",

@@ -23,7 +25,8 @@
"kafkajs": "^2.2.4",
"reflect-metadata": "^0.1.14",
"rxjs": "^7.8.1",
"swagger-ui-express": "^5.0.0",
"xlsx": "^0.18.5"
},
"devDependencies": {
"@nestjs/cli": "^10.2.1",

@@ -32,6 +35,7 @@
"@types/bcrypt": "^6.0.0",
"@types/express": "^4.17.21",
"@types/jsonwebtoken": "^9.0.10",
"@types/multer": "^1.4.13",
"@types/node": "^20.10.5",
"eslint": "^8.56.0",
"prettier": "^3.1.1",

@@ -627,6 +631,17 @@
"integrity": "sha512-4aErSrCR/On/e5G2hDP0wjooqDdauzEbIq8hIkIe5pXV0rtWJZvdCEKL0ykZxex+IxIwBp0eGeV48hQN07dXtw==",
"license": "MIT"
},
"node_modules/@nestjs/axios": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/@nestjs/axios/-/axios-3.1.3.tgz",
"integrity": "sha512-RZ/63c1tMxGLqyG3iOCVt7A72oy4x1eM6QEhd4KzCYpaVWW0igq0WSREeRoEZhIxRcZfDfIIkvsOMiM7yfVGZQ==",
"license": "MIT",
"peerDependencies": {
"@nestjs/common": "^7.0.0 || ^8.0.0 || ^9.0.0 || ^10.0.0",
"axios": "^1.3.1",
"rxjs": "^6.0.0 || ^7.0.0"
}
},
"node_modules/@nestjs/cli": {
"version": "10.4.9",
"resolved": "https://registry.npmjs.org/@nestjs/cli/-/cli-10.4.9.tgz",

@@ -1206,6 +1221,16 @@
"dev": true,
"license": "MIT"
},
"node_modules/@types/multer": {
"version": "1.4.13",
"resolved": "https://registry.npmjs.org/@types/multer/-/multer-1.4.13.tgz",
"integrity": "sha512-bhhdtPw7JqCiEfC9Jimx5LqX9BDIPJEh2q/fQ4bqbBPtyEZYr3cvF22NwG0DmPZNYA0CAf2CnqDB4KIGGpJcaw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/express": "*"
}
},
"node_modules/@types/node": {
"version": "20.19.28",
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.19.28.tgz",

@@ -1494,6 +1519,15 @@
"acorn": "^6.0.0 || ^7.0.0 || ^8.0.0"
}
},
"node_modules/adler-32": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/adler-32/-/adler-32-1.3.1.tgz",
"integrity": "sha512-ynZ4w/nUUv5rrsR8UUGoe1VC9hZj6V5hU9Qw1HlMDJGEJw5S7TfTErWTjMys6M7vr0YWcPqs3qAr4ss0nDfP+A==",
"license": "Apache-2.0",
"engines": {
"node": ">=0.8"
}
},
"node_modules/agent-base": {
"version": "6.0.2",
"resolved": "https://registry.npmjs.org/agent-base/-/agent-base-6.0.2.tgz",

@@ -1734,6 +1768,24 @@
"dev": true,
"license": "MIT"
},
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
"license": "MIT"
},
"node_modules/axios": {
"version": "1.13.2",
"resolved": "https://registry.npmjs.org/axios/-/axios-1.13.2.tgz",
"integrity": "sha512-VPk9ebNqPcy5lRGuSlKx752IlDatOjT9paPlm8A7yOuW2Fbvp4X3JznJtT4f0GzGLLiWE9W8onz51SqLYwzGaA==",
"license": "MIT",
"peer": true,
"dependencies": {
"follow-redirects": "^1.15.6",
"form-data": "^4.0.4",
"proxy-from-env": "^1.1.0"
}
},
"node_modules/balanced-match": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",

@@ -2028,6 +2080,19 @@
],
"license": "CC-BY-4.0"
},
"node_modules/cfb": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/cfb/-/cfb-1.2.2.tgz",
"integrity": "sha512-KfdUZsSOw19/ObEWasvBP/Ac4reZvAGauZhs6S/gqNhXhI7cKwvlH7ulj+dOEYnca4bm4SGo8C1bTAQvnTjgQA==",
"license": "Apache-2.0",
"dependencies": {
"adler-32": "~1.3.0",
"crc-32": "~1.2.0"
},
"engines": {
"node": ">=0.8"
}
},
"node_modules/chalk": {
"version": "4.1.2",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",

@@ -2185,6 +2250,15 @@
"node": ">=0.10.0"
}
},
"node_modules/codepage": {
"version": "1.15.0",
"resolved": "https://registry.npmjs.org/codepage/-/codepage-1.15.0.tgz",
"integrity": "sha512-3g6NUTPd/YtuuGrhMnOMRjFc+LJw/bnMp3+0r/Wcz3IXUuCosKRJvMphm5+Q+bvTVGcJJuRvVLuYba+WojaFaA==",
"license": "Apache-2.0",
"engines": {
"node": ">=0.8"
}
},
"node_modules/color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",

@@ -2212,6 +2286,18 @@
"color-support": "bin.js"
}
},
"node_modules/combined-stream": {
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
"license": "MIT",
"dependencies": {
"delayed-stream": "~1.0.0"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/commander": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/commander/-/commander-4.1.1.tgz",

@@ -2355,6 +2441,18 @@
}
}
},
"node_modules/crc-32": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/crc-32/-/crc-32-1.2.2.tgz",
"integrity": "sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ==",
"license": "Apache-2.0",
"bin": {
"crc32": "bin/crc32.njs"
},
"engines": {
"node": ">=0.8"
}
},
"node_modules/cross-spawn": {
"version": "7.0.6",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",

@@ -2433,6 +2531,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/delayed-stream": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
"license": "MIT",
"engines": {
"node": ">=0.4.0"
}
},
"node_modules/delegates": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delegates/-/delegates-1.0.0.tgz",

@@ -2629,6 +2736,21 @@
"node": ">= 0.4"
}
},
"node_modules/es-set-tostringtag": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
"license": "MIT",
"dependencies": {
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.6",
"has-tostringtag": "^1.0.2",
"hasown": "^2.0.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/escalade": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz",

@@ -3136,6 +3258,26 @@
"dev": true,
"license": "ISC"
},
"node_modules/follow-redirects": {
"version": "1.15.11",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/RubenVerborgh"
}
],
"license": "MIT",
"engines": {
"node": ">=4.0"
},
"peerDependenciesMeta": {
"debug": {
"optional": true
}
}
},
"node_modules/foreground-child": {
"version": "3.3.1",
"resolved": "https://registry.npmjs.org/foreground-child/-/foreground-child-3.3.1.tgz",

@@ -3182,6 +3324,22 @@
"webpack": "^5.11.0"
}
},
"node_modules/form-data": {
"version": "4.0.5",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.5.tgz",
"integrity": "sha512-8RipRLol37bNs2bhoV67fiTEvdTrbMUYcFTiy3+wuuOnUog2QBHCZWXDRijWQfAkhBj2Uf5UnVaiWwA5vdd82w==",
"license": "MIT",
"dependencies": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
"es-set-tostringtag": "^2.1.0",
"hasown": "^2.0.2",
"mime-types": "^2.1.12"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/forwarded": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",

@@ -3191,6 +3349,15 @@
"node": ">= 0.6"
}
},
"node_modules/frac": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/frac/-/frac-1.1.2.tgz",
"integrity": "sha512-w/XBfkibaTl3YDqASwfDUqkna4Z2p9cFSr1aHDt0WoMTECnRfBOv2WArlZILlqgWlmdIlALXGpM2AOhEk5W3IA==",
"license": "Apache-2.0",
"engines": {
"node": ">=0.8"
}
},
"node_modules/fresh": {
"version": "0.5.2",
"resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",

@@ -3493,6 +3660,21 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-tostringtag": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
"license": "MIT",
"dependencies": {
"has-symbols": "^1.0.3"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-unicode": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/has-unicode/-/has-unicode-2.0.1.tgz",

@@ -4878,6 +5060,12 @@
"node": ">= 0.10"
}
},
"node_modules/proxy-from-env": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
"license": "MIT"
},
"node_modules/punycode": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz",

@@ -5450,6 +5638,18 @@
"node": ">=0.10.0"
}
},
"node_modules/ssf": {
"version": "0.11.2",
"resolved": "https://registry.npmjs.org/ssf/-/ssf-0.11.2.tgz",
"integrity": "sha512-+idbmIXoYET47hH+d7dfm2epdOMUDjqcB4648sTZ+t2JwoyBFL/insLfB/racrDmsKB3diwsDA696pZMieAC5g==",
"license": "Apache-2.0",
"dependencies": {
"frac": "~1.1.2"
},
"engines": {
"node": ">=0.8"
}
},
"node_modules/standard-as-callback": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/standard-as-callback/-/standard-as-callback-2.1.0.tgz",

@@ -6252,6 +6452,24 @@
"string-width": "^1.0.2 || 2 || 3 || 4"
}
},
"node_modules/wmf": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/wmf/-/wmf-1.0.2.tgz",
"integrity": "sha512-/p9K7bEh0Dj6WbXg4JG0xvLQmIadrner1bi45VMJTfnbVHsc7yIajZyoSoK60/dtVBs12Fm6WkUI5/3WAVsNMw==",
"license": "Apache-2.0",
"engines": {
"node": ">=0.8"
}
},
"node_modules/word": {
"version": "0.3.0",
"resolved": "https://registry.npmjs.org/word/-/word-0.3.0.tgz",
"integrity": "sha512-OELeY0Q61OXpdUfTp+oweA/vtLVg5VDOXh+3he3PNzLGG/y0oylSOC1xRVj0+l4vQ3tj/bB1HVHv1ocXkQceFA==",
"license": "Apache-2.0",
"engines": {
"node": ">=0.8"
}
},
"node_modules/word-wrap": {
"version": "1.2.5",
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",

@@ -6302,6 +6520,27 @@
"integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==",
"license": "ISC"
},
"node_modules/xlsx": {
"version": "0.18.5",
"resolved": "https://registry.npmjs.org/xlsx/-/xlsx-0.18.5.tgz",
"integrity": "sha512-dmg3LCjBPHZnQp5/F/+nnTa+miPJxUXB6vtk42YjBBKayDNagxGEeIdWApkYPOf3Z3pm3k62Knjzp7lMeTEtFQ==",
"license": "Apache-2.0",
"dependencies": {
"adler-32": "~1.3.0",
"cfb": "~1.2.1",
"codepage": "~1.15.0",
"crc-32": "~1.2.1",
"ssf": "~0.11.2",
"wmf": "~1.0.1",
"word": "~0.3.0"
},
"bin": {
"xlsx": "bin/xlsx.njs"
},
"engines": {
"node": ">=0.8"
}
},
"node_modules/xtend": {
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/xtend/-/xtend-4.0.2.tgz",


@@ -15,12 +15,14 @@
"prisma:migrate": "prisma migrate dev"
},
"dependencies": {
"@nestjs/axios": "^3.1.3",
"@nestjs/common": "^10.3.0",
"@nestjs/config": "^3.1.1",
"@nestjs/core": "^10.3.0",
"@nestjs/platform-express": "^10.3.0",
"@nestjs/swagger": "^7.1.17",
"@prisma/client": "^5.7.1",
"axios": "^1.13.2",
"bcrypt": "^5.1.1",
"class-transformer": "^0.5.1",
"class-validator": "^0.14.0",

@@ -30,7 +32,8 @@
"kafkajs": "^2.2.4",
"reflect-metadata": "^0.1.14",
"rxjs": "^7.8.1",
"swagger-ui-express": "^5.0.0",
"xlsx": "^0.18.5"
},
"devDependencies": {
"@nestjs/cli": "^10.2.1",

@@ -39,6 +42,7 @@
"@types/bcrypt": "^6.0.0",
"@types/express": "^4.17.21",
"@types/jsonwebtoken": "^9.0.10",
"@types/multer": "^1.4.13",
"@types/node": "^20.10.5",
"eslint": "^8.56.0",
"prettier": "^3.1.1",

View File

@@ -1,8 +1,8 @@
-- ============================================================================
-- mining-admin-service initial migration
--- Merged from: 20260111000000_init, 20260112110000_add_referral_adoption_nickname,
--- 20260112150000_add_unlocked_bonus_tiers, 20260112200000_add_contribution_records_network_progress,
--- 20260113000000_use_prisma_relation_mode, 20260113100000_add_distribution_summary
+-- Merged from: 0001_init, 0002_fix_processed_event_composite_key,
+-- 20250120000001_add_region_to_synced_system_contributions,
+-- 20250120000002_add_synced_system_contribution_records
-- Note: uses Prisma relationMode = "prisma"; no FK constraints are created at the database layer
-- ============================================================================
@@ -302,10 +302,11 @@ CREATE TABLE "synced_circulation_pools" (
CONSTRAINT "synced_circulation_pools_pkey" PRIMARY KEY ("id")
);
--- CreateTable
+-- CreateTable: system account contribution (from contribution-service)
CREATE TABLE "synced_system_contributions" (
"id" TEXT NOT NULL,
"accountType" TEXT NOT NULL,
"region_code" TEXT,
"name" TEXT NOT NULL,
"contributionBalance" DECIMAL(30,8) NOT NULL DEFAULT 0,
"contributionNeverExpires" BOOLEAN NOT NULL DEFAULT false,
@@ -687,8 +688,12 @@ CREATE UNIQUE INDEX "synced_daily_mining_stats_statDate_key" ON "synced_daily_mi
-- CreateIndex
CREATE UNIQUE INDEX "synced_day_klines_klineDate_key" ON "synced_day_klines"("klineDate");
-- CreateIndex: synced_system_contributions
-- Composite unique key on accountType + COALESCE(region_code, '__NULL__')
-- Note: in PostgreSQL NULL != NULL, so a plain unique index on region_code cannot block duplicate (OPERATION, NULL) rows
-CREATE UNIQUE INDEX "synced_system_contributions_accountType_key" ON "synced_system_contributions"("accountType");
+CREATE UNIQUE INDEX "synced_system_contributions_accountType_region_code_key" ON "synced_system_contributions"("accountType", COALESCE(region_code, '__NULL__'));
CREATE INDEX "synced_system_contributions_accountType_idx" ON "synced_system_contributions"("accountType");
CREATE INDEX "synced_system_contributions_region_code_idx" ON "synced_system_contributions"("region_code");
-- CreateIndex
CREATE UNIQUE INDEX "cdc_sync_progress_sourceTopic_key" ON "cdc_sync_progress"("sourceTopic");
@@ -696,11 +701,8 @@ CREATE UNIQUE INDEX "cdc_sync_progress_sourceTopic_key" ON "cdc_sync_progress"("
-- CreateIndex
CREATE INDEX "cdc_sync_progress_sourceService_idx" ON "cdc_sync_progress"("sourceService");
-- CreateIndex (composite unique key replaces the standalone eventId unique constraint)
-CREATE UNIQUE INDEX "processed_events_eventId_key" ON "processed_events"("eventId");
-CREATE INDEX "processed_events_sourceService_idx" ON "processed_events"("sourceService");
+CREATE UNIQUE INDEX "processed_events_sourceService_eventId_key" ON "processed_events"("sourceService", "eventId");
-- CreateIndex
CREATE INDEX "processed_events_processedAt_idx" ON "processed_events"("processedAt");
@@ -860,3 +862,40 @@ CREATE UNIQUE INDEX "synced_fee_configs_fee_type_key" ON "synced_fee_configs"("f
-- AddForeignKey (keep the admin-related foreign keys)
ALTER TABLE "audit_logs" ADD CONSTRAINT "audit_logs_adminId_fkey" FOREIGN KEY ("adminId") REFERENCES "admin_users"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- ============================================================================
-- Synced system account contribution detail table
-- Stores the per-source contribution details synced from contribution-service
-- for the system accounts
-- ============================================================================
-- CreateTable: system account contribution details (from contribution-service)
CREATE TABLE "synced_system_contribution_records" (
"id" TEXT NOT NULL,
"original_record_id" BIGINT NOT NULL,
"account_type" TEXT NOT NULL,
"region_code" TEXT,
"source_adoption_id" BIGINT NOT NULL,
"source_account_sequence" TEXT NOT NULL,
-- Source type: FIXED_RATE (fixed ratio) / LEVEL_OVERFLOW (level overflow) / LEVEL_NO_ANCESTOR (no upline) / BONUS_TIER_1/2/3 (team bonus not yet unlocked)
"source_type" VARCHAR(30) NOT NULL,
-- Level depth (1-15); only meaningful for the LEVEL_OVERFLOW and LEVEL_NO_ANCESTOR types
"level_depth" INTEGER,
"distribution_rate" DECIMAL(10,6) NOT NULL,
"amount" DECIMAL(30,10) NOT NULL,
"effective_date" DATE NOT NULL,
"expire_date" DATE,
"is_expired" BOOLEAN NOT NULL DEFAULT false,
"created_at" TIMESTAMP(3) NOT NULL,
"syncedAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "synced_system_contribution_records_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "synced_system_contribution_records_original_record_id_key" ON "synced_system_contribution_records"("original_record_id");
CREATE INDEX "synced_system_contribution_records_account_type_region_code_idx" ON "synced_system_contribution_records"("account_type", "region_code");
CREATE INDEX "synced_system_contribution_records_source_adoption_id_idx" ON "synced_system_contribution_records"("source_adoption_id");
CREATE INDEX "synced_system_contribution_records_source_account_sequence_idx" ON "synced_system_contribution_records"("source_account_sequence");
CREATE INDEX "synced_system_contribution_records_source_type_idx" ON "synced_system_contribution_records"("source_type");
CREATE INDEX "synced_system_contribution_records_created_at_idx" ON "synced_system_contribution_records"("created_at" DESC);

View File

@@ -1,26 +0,0 @@
-- ============================================================================
-- Fix the idempotency key on the processed_events table
-- Gives 2.0 inter-service Outbox events exactly-once semantics
-- ============================================================================
--
-- Problem: eventId alone was the unique key, but outbox IDs from different services can collide
-- Fix: use (sourceService, eventId) as a composite unique key
--
-- Key semantics:
-- - sourceService: name of the emitting service (e.g. "auth-service", "contribution-service")
-- - eventId: auto-increment ID in the sender's outbox table (a database primary key, not a UUID)
-- - the combination is globally unique, so the origin of every event can be traced exactly
-- ============================================================================
-- Clear existing rows first (earlier data may contain conflicts)
TRUNCATE TABLE "processed_events";
-- Drop the old unique index (eventId only)
DROP INDEX IF EXISTS "processed_events_eventId_key";
-- Drop the old plain index on sourceService
DROP INDEX IF EXISTS "processed_events_sourceService_idx";
-- Create the new composite unique index: (sourceService, eventId)
-- This combination guarantees cross-service uniqueness
CREATE UNIQUE INDEX "processed_events_sourceService_eventId_key" ON "processed_events"("sourceService", "eventId");

View File

@@ -422,16 +422,62 @@ model SyncedCirculationPool {
model SyncedSystemContribution {
id String @id @default(uuid())
-accountType String @unique // OPERATION, PROVINCE, CITY, HEADQUARTERS
+accountType String // OPERATION / PROVINCE / CITY / HEADQUARTERS
regionCode String? @map("region_code") // province/city code, e.g. 440000, 440100
name String
contributionBalance Decimal @db.Decimal(30, 8) @default(0)
contributionNeverExpires Boolean @default(false)
syncedAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@unique([accountType, regionCode])
@@index([accountType])
@@index([regionCode])
@@map("synced_system_contributions")
}
// =============================================================================
// CDC synced table - system account contribution details (from contribution-service)
// =============================================================================
model SyncedSystemContributionRecord {
id String @id @default(uuid())
originalRecordId BigInt @unique @map("original_record_id") // original ID in contribution-service
// System account info (stored redundantly for easy querying)
accountType String @map("account_type") // OPERATION / PROVINCE / CITY / HEADQUARTERS
regionCode String? @map("region_code") // province/city code
// Source info
sourceAdoptionId BigInt @map("source_adoption_id") // source adoption ID
sourceAccountSequence String @map("source_account_sequence") // adopter's account sequence
// Source type: FIXED_RATE (fixed ratio) / LEVEL_OVERFLOW (level overflow) / LEVEL_NO_ANCESTOR (no upline) / BONUS_TIER_1/2/3 (team bonus not yet unlocked)
sourceType String @map("source_type") @db.VarChar(30)
// Level depth (1-15); only meaningful for the LEVEL_OVERFLOW and LEVEL_NO_ANCESTOR types
levelDepth Int? @map("level_depth")
// Distribution parameters
distributionRate Decimal @map("distribution_rate") @db.Decimal(10, 6) // distribution ratio
amount Decimal @map("amount") @db.Decimal(30, 10) // contribution amount
// Validity window
effectiveDate DateTime @map("effective_date") @db.Date // effective date
expireDate DateTime? @map("expire_date") @db.Date // expiry date (usually null for system accounts, i.e. never expires)
isExpired Boolean @default(false) @map("is_expired")
createdAt DateTime @map("created_at") // creation time of the original record
syncedAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([accountType, regionCode])
@@index([sourceAdoptionId])
@@index([sourceAccountSequence])
@@index([sourceType])
@@index([createdAt(sort: Desc)])
@@map("synced_system_contribution_records")
}
// =============================================================================
// CDC sync progress tracking
// =============================================================================

View File

@@ -3,11 +3,14 @@ import { ApplicationModule } from '../application/application.module';
import { AuthController } from './controllers/auth.controller';
import { DashboardController } from './controllers/dashboard.controller';
import { ConfigController } from './controllers/config.controller';
import { InitializationController } from './controllers/initialization.controller';
import { AuditController } from './controllers/audit.controller';
import { HealthController } from './controllers/health.controller';
import { UsersController } from './controllers/users.controller';
import { SystemAccountsController } from './controllers/system-accounts.controller';
import { ReportsController } from './controllers/reports.controller';
import { ManualMiningController } from './controllers/manual-mining.controller';
import { PendingContributionsController } from './controllers/pending-contributions.controller';
import { BatchMiningController } from './controllers/batch-mining.controller';
@Module({
imports: [ApplicationModule],
@@ -15,11 +18,14 @@ import { SystemAccountsController } from './controllers/system-accounts.controll
AuthController,
DashboardController,
ConfigController,
InitializationController,
AuditController,
HealthController,
UsersController,
SystemAccountsController,
ReportsController,
ManualMiningController,
PendingContributionsController,
BatchMiningController,
],
})
export class ApiModule {}

View File

@@ -4,7 +4,7 @@ import { DashboardService } from '../../application/services/dashboard.service';
@ApiTags('Audit')
@ApiBearerAuth()
-@Controller('audit-logs')
+@Controller('audit')
export class AuditController {
constructor(private readonly dashboardService: DashboardService) {}
@@ -13,15 +13,42 @@ export class AuditController {
@ApiQuery({ name: 'adminId', required: false })
@ApiQuery({ name: 'action', required: false })
@ApiQuery({ name: 'resource', required: false })
@ApiQuery({ name: 'keyword', required: false })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getAuditLogs(
@Query('adminId') adminId?: string,
@Query('action') action?: string,
@Query('resource') resource?: string,
@Query('keyword') keyword?: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
-return this.dashboardService.getAuditLogs({ adminId, action, resource, page: page ?? 1, pageSize: pageSize ?? 50 });
+const result = await this.dashboardService.getAuditLogs({
adminId,
action,
resource,
page: page ?? 1,
pageSize: pageSize ?? 20,
});
// Map to the shape the frontend expects
return {
items: result.data.map((log: any) => ({
id: log.id,
adminId: log.adminId,
adminUsername: log.admin?.username || 'unknown',
action: log.action,
resource: log.resource,
resourceId: log.resourceId,
details: log.newValue ? JSON.stringify(log.newValue) : null,
ipAddress: log.ipAddress || '-',
createdAt: log.createdAt,
})),
total: result.total,
page: result.pagination.page,
pageSize: result.pagination.pageSize,
totalPages: result.pagination.totalPages,
};
}
}
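The response shaping inside getAuditLogs can be factored into a small pure helper so it is testable in isolation. A sketch under the assumption that the raw log shape matches how the controller reads it (the RawAuditLog interface is inferred from usage, not taken from the actual Prisma model):

```typescript
// Assumed shape of a raw audit-log row, inferred from the controller's field accesses.
interface RawAuditLog {
  id: string;
  adminId: string;
  admin?: { username: string } | null;
  action: string;
  resource: string;
  resourceId: string | null;
  newValue: unknown;
  ipAddress: string | null;
  createdAt: Date;
}

// Pure mapping with the same fallbacks as the controller:
// missing admin -> 'unknown', missing IP -> '-', newValue serialized to JSON.
function toAuditItem(log: RawAuditLog) {
  return {
    id: log.id,
    adminId: log.adminId,
    adminUsername: log.admin?.username || 'unknown',
    action: log.action,
    resource: log.resource,
    resourceId: log.resourceId,
    details: log.newValue ? JSON.stringify(log.newValue) : null,
    ipAddress: log.ipAddress || '-',
    createdAt: log.createdAt,
  };
}
```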

View File

@ -0,0 +1,364 @@
import {
Controller,
Get,
Post,
Body,
Req,
HttpException,
HttpStatus,
UseInterceptors,
UploadedFile,
Logger,
} from '@nestjs/common';
import {
ApiTags,
ApiOperation,
ApiBearerAuth,
ApiBody,
ApiConsumes,
} from '@nestjs/swagger';
import { FileInterceptor } from '@nestjs/platform-express';
import * as XLSX from 'xlsx';
import { BatchMiningService, BatchMiningItem } from '../../application/services/batch-mining.service';
@ApiTags('Batch Mining')
@ApiBearerAuth()
@Controller('batch-mining')
export class BatchMiningController {
private readonly logger = new Logger(BatchMiningController.name);
constructor(private readonly batchMiningService: BatchMiningService) {}
@Get('status')
@ApiOperation({ summary: '获取批量补发状态(是否已执行)' })
async getStatus() {
this.logger.log(`[GET /batch-mining/status] 请求获取批量补发状态`);
try {
const result = await this.batchMiningService.getStatus();
this.logger.log(`[GET /batch-mining/status] 返回: ${JSON.stringify(result)}`);
return result;
} catch (error) {
this.logger.error(`[GET /batch-mining/status] 错误:`, error);
throw error;
}
}
@Post('upload-preview')
@ApiOperation({ summary: '上传 Excel 文件并预览(不执行)' })
@ApiConsumes('multipart/form-data')
@ApiBody({
schema: {
type: 'object',
properties: {
file: {
type: 'string',
format: 'binary',
description: 'Excel 文件 (.xlsx)',
},
},
},
})
@UseInterceptors(FileInterceptor('file'))
async uploadAndPreview(@UploadedFile() file: Express.Multer.File) {
this.logger.log(`[POST /batch-mining/upload-preview] 开始处理上传预览请求`);
if (!file) {
this.logger.error(`[POST /batch-mining/upload-preview] 未收到文件`);
throw new HttpException('请上传文件', HttpStatus.BAD_REQUEST);
}
this.logger.log(`[POST /batch-mining/upload-preview] 收到文件: ${file.originalname}, 大小: ${file.size}, 类型: ${file.mimetype}`);
// Validate the file type
const validTypes = [
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
'application/vnd.ms-excel',
];
if (!validTypes.includes(file.mimetype) && !file.originalname.endsWith('.xlsx')) {
this.logger.error(`[POST /batch-mining/upload-preview] 文件类型不正确: ${file.mimetype}`);
throw new HttpException('请上传 Excel 文件 (.xlsx)', HttpStatus.BAD_REQUEST);
}
try {
// Parse the Excel workbook
this.logger.log(`[POST /batch-mining/upload-preview] 开始解析 Excel...`);
const workbook = XLSX.read(file.buffer, { type: 'buffer' });
this.logger.log(`[POST /batch-mining/upload-preview] Excel Sheet 列表: ${workbook.SheetNames.join(', ')}`);
const sheetName = workbook.SheetNames[0];
// Prefer Sheet2 when present, otherwise fall back to the first sheet
const actualSheetName = workbook.SheetNames.includes('Sheet2') ? 'Sheet2' : sheetName;
const actualSheet = workbook.Sheets[actualSheetName];
this.logger.log(`[POST /batch-mining/upload-preview] 使用 Sheet: ${actualSheetName}`);
// Convert the sheet to an array of rows
const rows: any[][] = XLSX.utils.sheet_to_json(actualSheet, { header: 1 });
this.logger.log(`[POST /batch-mining/upload-preview] Excel 总行数: ${rows.length}`);
// Parse the rows into batch items
const items = this.batchMiningService.parseExcelData(rows);
this.logger.log(`[POST /batch-mining/upload-preview] 解析后有效数据: ${items.length}`);
if (items.length === 0) {
this.logger.error(`[POST /batch-mining/upload-preview] Excel 文件中没有有效数据`);
throw new HttpException('Excel 文件中没有有效数据', HttpStatus.BAD_REQUEST);
}
// Call the mining-service preview API
this.logger.log(`[POST /batch-mining/upload-preview] 调用 mining-service 预览 API...`);
const preview = await this.batchMiningService.preview(items);
this.logger.log(`[POST /batch-mining/upload-preview] 预览成功, 总金额: ${preview.grandTotalAmount}`);
return {
...preview,
parsedItems: items,
originalFileName: file.originalname,
};
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[POST /batch-mining/upload-preview] 解析 Excel 文件失败:`, error);
throw new HttpException(
`解析 Excel 文件失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.BAD_REQUEST,
);
}
}
@Post('preview')
@ApiOperation({ summary: '预览批量补发(传入解析后的数据)' })
@ApiBody({
schema: {
type: 'object',
required: ['items'],
properties: {
items: {
type: 'array',
items: {
type: 'object',
properties: {
accountSequence: { type: 'string' },
treeCount: { type: 'number' },
miningStartDate: { type: 'string' },
batch: { type: 'number' },
preMineDays: { type: 'number' },
remark: { type: 'string' },
},
},
},
},
},
})
async preview(@Body() body: { items: BatchMiningItem[] }) {
this.logger.log(`[POST /batch-mining/preview] 请求预览, 数据条数: ${body.items?.length || 0}`);
if (!body.items || body.items.length === 0) {
this.logger.error(`[POST /batch-mining/preview] 数据为空`);
throw new HttpException('数据不能为空', HttpStatus.BAD_REQUEST);
}
try {
const result = await this.batchMiningService.preview(body.items);
this.logger.log(`[POST /batch-mining/preview] 预览成功`);
return result;
} catch (error) {
this.logger.error(`[POST /batch-mining/preview] 错误:`, error);
throw error;
}
}
@Post('upload-execute')
@ApiOperation({ summary: '上传 Excel 文件并执行批量补发(只能执行一次)' })
@ApiConsumes('multipart/form-data')
@ApiBody({
schema: {
type: 'object',
required: ['file', 'reason'],
properties: {
file: {
type: 'string',
format: 'binary',
description: 'Excel 文件 (.xlsx)',
},
reason: {
type: 'string',
description: '补发原因(必填)',
},
},
},
})
@UseInterceptors(FileInterceptor('file'))
async uploadAndExecute(
@UploadedFile() file: Express.Multer.File,
@Body() body: { reason: string },
@Req() req: any,
) {
this.logger.log(`[POST /batch-mining/upload-execute] 开始处理上传执行请求`);
if (!file) {
this.logger.error(`[POST /batch-mining/upload-execute] 未收到文件`);
throw new HttpException('请上传文件', HttpStatus.BAD_REQUEST);
}
this.logger.log(`[POST /batch-mining/upload-execute] 收到文件: ${file.originalname}, 原因: ${body.reason}`);
if (!body.reason || body.reason.trim().length === 0) {
this.logger.error(`[POST /batch-mining/upload-execute] 补发原因为空`);
throw new HttpException('补发原因不能为空', HttpStatus.BAD_REQUEST);
}
// Validate the file type
const validTypes = [
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
'application/vnd.ms-excel',
];
if (!validTypes.includes(file.mimetype) && !file.originalname.endsWith('.xlsx')) {
this.logger.error(`[POST /batch-mining/upload-execute] 文件类型不正确: ${file.mimetype}`);
throw new HttpException('请上传 Excel 文件 (.xlsx)', HttpStatus.BAD_REQUEST);
}
try {
// Parse the Excel workbook
this.logger.log(`[POST /batch-mining/upload-execute] 开始解析 Excel...`);
const workbook = XLSX.read(file.buffer, { type: 'buffer' });
this.logger.log(`[POST /batch-mining/upload-execute] Excel Sheet 列表: ${workbook.SheetNames.join(', ')}`);
// Prefer Sheet2 when present, otherwise fall back to the first sheet
const actualSheetName = workbook.SheetNames.includes('Sheet2') ? 'Sheet2' : workbook.SheetNames[0];
const actualSheet = workbook.Sheets[actualSheetName];
this.logger.log(`[POST /batch-mining/upload-execute] 使用 Sheet: ${actualSheetName}`);
// Convert the sheet to an array of rows
const rows: any[][] = XLSX.utils.sheet_to_json(actualSheet, { header: 1 });
this.logger.log(`[POST /batch-mining/upload-execute] Excel 总行数: ${rows.length}`);
// Parse the rows into batch items
const items = this.batchMiningService.parseExcelData(rows);
this.logger.log(`[POST /batch-mining/upload-execute] 解析后有效数据: ${items.length}`);
if (items.length === 0) {
this.logger.error(`[POST /batch-mining/upload-execute] Excel 文件中没有有效数据`);
throw new HttpException('Excel 文件中没有有效数据', HttpStatus.BAD_REQUEST);
}
const admin = req.admin;
this.logger.log(`[POST /batch-mining/upload-execute] 操作管理员: ${admin?.username} (${admin?.id})`);
// Call the mining-service execute API
this.logger.log(`[POST /batch-mining/upload-execute] 调用 mining-service 执行 API...`);
const result = await this.batchMiningService.execute(
{
items,
operatorId: admin.id,
operatorName: admin.username,
reason: body.reason,
},
admin.id,
);
this.logger.log(`[POST /batch-mining/upload-execute] 执行成功: successCount=${result.successCount}, totalAmount=${result.totalAmount}`);
return {
...result,
originalFileName: file.originalname,
};
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[POST /batch-mining/upload-execute] 执行失败:`, error);
throw new HttpException(
`执行失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.BAD_REQUEST,
);
}
}
@Post('execute')
@ApiOperation({ summary: '执行批量补发(传入解析后的数据,只能执行一次)' })
@ApiBody({
schema: {
type: 'object',
required: ['items', 'reason'],
properties: {
items: {
type: 'array',
items: {
type: 'object',
properties: {
accountSequence: { type: 'string' },
treeCount: { type: 'number' },
miningStartDate: { type: 'string' },
batch: { type: 'number' },
preMineDays: { type: 'number' },
remark: { type: 'string' },
},
},
},
reason: { type: 'string', description: '补发原因(必填)' },
},
},
})
async execute(
@Body() body: { items: BatchMiningItem[]; reason: string },
@Req() req: any,
) {
this.logger.log(`[POST /batch-mining/execute] 请求执行批量补发`);
this.logger.log(`[POST /batch-mining/execute] 数据条数: ${body.items?.length || 0}, 原因: ${body.reason}`);
if (!body.items || body.items.length === 0) {
this.logger.error(`[POST /batch-mining/execute] 数据为空`);
throw new HttpException('数据不能为空', HttpStatus.BAD_REQUEST);
}
if (!body.reason || body.reason.trim().length === 0) {
this.logger.error(`[POST /batch-mining/execute] 补发原因为空`);
throw new HttpException('补发原因不能为空', HttpStatus.BAD_REQUEST);
}
const admin = req.admin;
this.logger.log(`[POST /batch-mining/execute] 操作管理员: ${admin?.username} (${admin?.id})`);
try {
const result = await this.batchMiningService.execute(
{
items: body.items,
operatorId: admin.id,
operatorName: admin.username,
reason: body.reason,
},
admin.id,
);
this.logger.log(`[POST /batch-mining/execute] 执行成功`);
return result;
} catch (error) {
this.logger.error(`[POST /batch-mining/execute] 错误:`, error);
throw error;
}
}
@Get('execution')
@ApiOperation({ summary: '获取批量补发执行记录(含明细)' })
async getExecution() {
this.logger.log(`[GET /batch-mining/execution] 请求获取执行记录`);
try {
const execution = await this.batchMiningService.getExecution();
if (!execution) {
this.logger.log(`[GET /batch-mining/execution] 尚未执行过批量补发`);
throw new HttpException('尚未执行过批量补发', HttpStatus.NOT_FOUND);
}
this.logger.log(`[GET /batch-mining/execution] 返回执行记录: id=${execution.id}`);
return execution;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[GET /batch-mining/execution] 错误:`, error);
throw error;
}
}
}
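Both upload endpoints above hand the raw `header: 1` row array to `parseExcelData`. What that parser must handle can be sketched from the column layout the fix commit documents (index 0: row number, 1: registration ID, 2: tree count, 3: mining start date, 4: batch, 5: authorized pre-mine days, 6: remark) and the three accepted date formats (2025.11.8, 2025-11-08, 2025/11/8). This is an illustrative stand-in, not the service implementation:

```typescript
// Illustrative parser for the documented Excel layout; names are assumptions.
interface BatchRow {
  accountSequence: string;
  treeCount: number;
  miningStartDate: string; // normalized to YYYY-MM-DD
  batch: number;
  preMineDays: number;
  remark: string;
}

// Accepts 2025.11.8 / 2025-11-08 / 2025/11/8 and normalizes to 2025-11-08.
function normalizeDate(raw: string): string | null {
  const m = String(raw).trim().match(/^(\d{4})[.\-\/](\d{1,2})[.\-\/](\d{1,2})$/);
  if (!m) return null;
  return `${m[1]}-${m[2].padStart(2, '0')}-${m[3].padStart(2, '0')}`;
}

function parseRows(rows: unknown[][]): BatchRow[] {
  const items: BatchRow[] = [];
  for (const row of rows.slice(1)) { // skip the header row
    // Index 0 is the row number, so the user ID starts at index 1.
    const [, id, trees, start, batch, preDays, remark] = row;
    const date = normalizeDate(String(start ?? ''));
    if (!id || !date) continue; // skip blank or malformed rows
    items.push({
      accountSequence: String(id),
      treeCount: Number(trees) || 0,
      miningStartDate: date,
      batch: Number(batch) || 0,
      preMineDays: Number(preDays) || 0,
      remark: String(remark ?? ''),
    });
  }
  return items;
}
```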

View File

@@ -1,5 +1,6 @@
-import { Controller, Get, Post, Delete, Body, Param, Query, Req } from '@nestjs/common';
+import { Controller, Get, Post, Delete, Body, Param, Query, Req, Logger } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiBearerAuth, ApiQuery, ApiParam } from '@nestjs/swagger';
import { ConfigService } from '@nestjs/config';
import { ConfigManagementService } from '../../application/services/config.service';
class SetConfigDto { category: string; key: string; value: string; description?: string; }
@@ -8,7 +9,12 @@ class SetConfigDto { category: string; key: string; value: string; description?:
@ApiBearerAuth()
@Controller('configs')
export class ConfigController {
-constructor(private readonly configService: ConfigManagementService) {}
+private readonly logger = new Logger(ConfigController.name);
+constructor(
+private readonly configService: ConfigManagementService,
+private readonly appConfigService: ConfigService,
+) {}
@Get()
@ApiOperation({ summary: '获取配置列表' })
@@ -17,6 +23,155 @@
return this.configService.getConfigs(category);
}
@Get('transfer-enabled')
@ApiOperation({ summary: '获取划转开关状态' })
async getTransferEnabled() {
const config = await this.configService.getConfig('system', 'transfer_enabled');
return { enabled: config?.configValue === 'true' };
}
@Post('transfer-enabled')
@ApiOperation({ summary: '设置划转开关状态' })
async setTransferEnabled(@Body() body: { enabled: boolean }, @Req() req: any) {
await this.configService.setConfig(req.admin.id, 'system', 'transfer_enabled', String(body.enabled), '划转开关');
return { success: true };
}
@Get('mining/status')
@ApiOperation({ summary: '获取挖矿状态' })
async getMiningStatus() {
const miningServiceUrl = this.appConfigService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const contributionServiceUrl = this.appConfigService.get<string>('CONTRIBUTION_SERVICE_URL', 'http://localhost:3020');
this.logger.log(`Fetching mining status from ${miningServiceUrl}/api/v2/admin/status`);
try {
// Fetch the mining-service status and contribution-service stats in parallel
const [miningResponse, contributionResponse] = await Promise.all([
fetch(`${miningServiceUrl}/api/v2/admin/status`),
fetch(`${contributionServiceUrl}/api/v2/contribution/stats`).catch(() => null),
]);
if (!miningResponse.ok) {
throw new Error(`Failed to fetch mining status: ${miningResponse.status}`);
}
const miningResult = await miningResponse.json();
this.logger.log(`Mining service response: ${JSON.stringify(miningResult)}`);
const miningData = miningResult.data || miningResult;
// Network-wide theoretical contribution from contribution-service
let networkTotalContribution: string | null = null;
let userEffectiveContribution: string | null = null;
let systemAccountsContribution: string | null = null;
if (contributionResponse && contributionResponse.ok) {
const contributionResult = await contributionResponse.json();
const data = contributionResult.data || contributionResult;
// Network theoretical contribution = total adopted trees × contribution per tree
networkTotalContribution = data.networkTotalContribution || null;
// Effective user contribution
userEffectiveContribution = data.totalContribution || null;
// System account contribution
const systemAccounts = data.systemAccounts || [];
const systemTotal = systemAccounts
.filter((a: any) => a.accountType !== 'HEADQUARTERS')
.reduce((sum: number, a: any) => sum + parseFloat(a.totalContribution || '0'), 0);
systemAccountsContribution = systemTotal.toString();
}
// Network theoretical contribution as seen by mining-service
const miningNetworkTotal = miningData.networkTotalContribution || '0';
// Effective user contribution as seen by mining-service
const miningUserTotal = miningData.totalContribution || '0';
// Decide whether contribution data has finished syncing.
// Core condition: the network theoretical contribution is synced (mining-service's
// networkTotalContribution is close to contribution-service's value).
// It is the mining denominator, so mining ratios are only correct once it is synced.
const networkSynced = networkTotalContribution !== null &&
parseFloat(networkTotalContribution) > 0 &&
parseFloat(miningNetworkTotal) > 0 &&
Math.abs(parseFloat(miningNetworkTotal) - parseFloat(networkTotalContribution)) / parseFloat(networkTotalContribution) < 0.001;
const isSynced = networkSynced;
return {
...miningData,
contributionSyncStatus: {
isSynced,
// Network theoretical contribution (should be used as the mining denominator)
networkTotalContribution: networkTotalContribution || '0',
miningNetworkTotal,
// Effective user contribution
userEffectiveContribution: userEffectiveContribution || '0',
miningUserTotal,
// System account contribution
systemAccountsContribution: systemAccountsContribution || '0',
// Legacy fields kept for compatibility
miningTotal: miningUserTotal,
contributionTotal: userEffectiveContribution || '0',
},
};
} catch (error) {
this.logger.error('Failed to get mining status', error);
return {
initialized: false,
isActive: false,
error: `Unable to connect to mining service: ${error.message}`,
contributionSyncStatus: {
isSynced: false,
networkTotalContribution: '0',
miningNetworkTotal: '0',
userEffectiveContribution: '0',
miningUserTotal: '0',
systemAccountsContribution: '0',
miningTotal: '0',
contributionTotal: '0',
},
};
}
}
@Post('mining/activate')
@ApiOperation({ summary: '激活挖矿' })
async activateMining(@Req() req: any) {
const miningServiceUrl = this.appConfigService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
try {
const response = await fetch(`${miningServiceUrl}/api/v2/admin/activate`, {
method: 'POST',
});
if (!response.ok) {
throw new Error('Failed to activate mining');
}
const result = await response.json();
this.logger.log(`Mining activated by admin ${req.admin?.id}`);
return result;
} catch (error) {
this.logger.error('Failed to activate mining', error);
return { success: false, message: 'Failed to activate mining' };
}
}
@Post('mining/deactivate')
@ApiOperation({ summary: '停用挖矿' })
async deactivateMining(@Req() req: any) {
const miningServiceUrl = this.appConfigService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
try {
const response = await fetch(`${miningServiceUrl}/api/v2/admin/deactivate`, {
method: 'POST',
});
if (!response.ok) {
throw new Error('Failed to deactivate mining');
}
const result = await response.json();
this.logger.log(`Mining deactivated by admin ${req.admin?.id}`);
return result;
} catch (error) {
this.logger.error('Failed to deactivate mining', error);
return { success: false, message: 'Failed to deactivate mining' };
}
}
@Get(':category/:key')
@ApiOperation({ summary: '获取单个配置' })
@ApiParam({ name: 'category' })

View File

@ -16,19 +16,105 @@ export class DashboardController {
@Get()
@ApiOperation({ summary: '获取仪表盘统计数据' })
async getStats() {
const raw = await this.dashboardService.getDashboardStats();
// 计算24小时价格变化
let priceChange24h = 0;
if (raw.latestPrice) {
const open = parseFloat(raw.latestPrice.open) || 1;
const close = parseFloat(raw.latestPrice.close) || 1;
priceChange24h = (close - open) / open;
}
// 详细算力分解数据
const dc = raw.detailedContribution || {};
// 转换为前端期望的格式
// 优先使用远程服务数据,因为 CDC 同步可能不完整
const remoteData = raw.remoteData || {};
return {
// 基础统计
totalUsers: raw.users?.total || 0,
adoptedUsers: raw.users?.adopted || 0,
totalTrees: raw.contribution?.totalTrees || 0,
networkEffectiveContribution: raw.contribution?.effectiveContribution || '0',
networkTotalContribution: raw.contribution?.totalContribution || '0',
networkLevelPending: dc.levelContribution?.pending || '0',
networkBonusPending: dc.bonusContribution?.pending || '0',
// 已分配积分股:优先使用远程数据
totalDistributed: remoteData.totalDistributed || raw.mining?.totalMined || '0',
// 已销毁积分股:优先使用远程数据
totalBurned: remoteData.totalBurned || raw.mining?.latestDailyStat?.totalBurned || '0',
// 流通池:优先使用远程数据
circulationPool: remoteData.circulationPool || raw.trading?.circulationPool?.totalShares || '0',
currentPrice: raw.latestPrice?.close || '1',
priceChange24h,
totalOrders: raw.trading?.totalAccounts || 0,
totalTrades: raw.trading?.totalAccounts || 0,
// ========== 详细算力分解 ==========
detailedContribution: {
totalTrees: dc.totalTrees || 0,
// 全网算力(理论值)= 总树数 * 22617
networkTotalTheory: dc.networkTotalTheory || '0',
// 个人算力70%
personalTheory: dc.personalTheory || '0',
personalActual: raw.contribution?.personalContribution || '0',
// 运营账户12%
operationTheory: dc.operationTheory || '0',
operationActual: dc.operationActual || '0',
// 省公司1%
provinceTheory: dc.provinceTheory || '0',
provinceActual: dc.provinceActual || '0',
// 市公司2%
cityTheory: dc.cityTheory || '0',
cityActual: dc.cityActual || '0',
// 层级算力7.5%
level: {
theory: dc.levelTheory || '0',
unlocked: dc.levelContribution?.unlocked || '0',
pending: dc.levelContribution?.pending || '0',
// 分档详情
tier1: dc.levelContribution?.byTier?.tier1 || { unlocked: '0', pending: '0' },
tier2: dc.levelContribution?.byTier?.tier2 || { unlocked: '0', pending: '0' },
tier3: dc.levelContribution?.byTier?.tier3 || { unlocked: '0', pending: '0' },
},
// 团队奖励算力7.5%
bonus: {
theory: dc.bonusTheory || '0',
unlocked: dc.bonusContribution?.unlocked || '0',
pending: dc.bonusContribution?.pending || '0',
// 分档详情
tier1: dc.bonusContribution?.byTier?.tier1 || { unlocked: '0', pending: '0' },
tier2: dc.bonusContribution?.byTier?.tier2 || { unlocked: '0', pending: '0' },
tier3: dc.bonusContribution?.byTier?.tier3 || { unlocked: '0', pending: '0' },
},
},
};
}
@Get('stats')
@ApiOperation({ summary: '获取仪表盘统计数据(别名)' })
async getStatsAlias() {
return this.getStats();
}
@Get('realtime')
@ApiOperation({ summary: '获取实时数据' })
async getRealtimeStats() {
const raw = await this.dashboardService.getRealtimeStats();
// 转换为前端期望的格式
return {
currentMinuteDistribution: raw.minuteDistribution || '0',
currentMinuteBurn: '0', // 暂无实时销毁数据
activeOrders: 0, // 暂无实时订单数据
pendingTrades: 0, // 暂无待处理交易数据
lastPriceUpdateAt: raw.timestamp,
};
}
@Get('reports') @Get('reports')

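The `getStats` transformation above derives `priceChange24h` from the day K-line's open and close. Below is a minimal sketch of that arithmetic, using the same `parseFloat(x) || 1` fallback the controller applies to missing or zero prices (`computePriceChange24h` is an illustrative name, not part of the service):

```typescript
// 24h price change as computed by the dashboard: (close - open) / open.
// `parseFloat(x) || 1` falls back to 1 for missing, zero, or non-numeric
// prices, mirroring the controller's guard.
function computePriceChange24h(openStr?: string, closeStr?: string): number {
  const open = parseFloat(openStr ?? '') || 1;
  const close = parseFloat(closeStr ?? '') || 1;
  return (close - open) / open;
}
```

Note that the fallback means a day with no K-line reports a change of 0 rather than NaN.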
View File

@ -1,77 +0,0 @@
import { Controller, Post, Body, Req } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiBearerAuth } from '@nestjs/swagger';
import { InitializationService } from '../../application/services/initialization.service';
class InitMiningConfigDto {
totalShares: string;
distributionPool: string;
halvingPeriodYears: number;
burnTarget: string;
}
@ApiTags('Initialization')
@ApiBearerAuth()
@Controller('initialization')
export class InitializationController {
constructor(private readonly initService: InitializationService) {}
@Post('mining-config')
@ApiOperation({ summary: '初始化挖矿配置' })
async initMiningConfig(@Body() dto: InitMiningConfigDto, @Req() req: any) {
return this.initService.initializeMiningConfig(req.admin.id, dto);
}
@Post('system-accounts')
@ApiOperation({ summary: '初始化系统账户' })
async initSystemAccounts(@Req() req: any) {
return this.initService.initializeSystemAccounts(req.admin.id);
}
@Post('activate-mining')
@ApiOperation({ summary: '激活挖矿' })
async activateMining(@Req() req: any) {
return this.initService.activateMining(req.admin.id);
}
@Post('sync-users')
@ApiOperation({ summary: '同步所有用户数据从auth-service初始同步' })
async syncUsers(@Req() req: any) {
return this.initService.syncAllUsers(req.admin.id);
}
@Post('sync-contribution-accounts')
@ApiOperation({ summary: '同步所有算力账户从contribution-service初始同步' })
async syncContributionAccounts(@Req() req: any) {
return this.initService.syncAllContributionAccounts(req.admin.id);
}
@Post('sync-mining-accounts')
@ApiOperation({ summary: '同步所有挖矿账户从mining-service初始同步' })
async syncMiningAccounts(@Req() req: any) {
return this.initService.syncAllMiningAccounts(req.admin.id);
}
@Post('sync-trading-accounts')
@ApiOperation({ summary: '同步所有交易账户从trading-service初始同步' })
async syncTradingAccounts(@Req() req: any) {
return this.initService.syncAllTradingAccounts(req.admin.id);
}
@Post('sync-all')
@ApiOperation({ summary: '执行完整的数据同步(用户+算力+挖矿+交易)' })
async syncAll(@Req() req: any) {
const adminId = req.admin.id;
const results = {
users: await this.initService.syncAllUsers(adminId),
contribution: await this.initService.syncAllContributionAccounts(adminId),
mining: await this.initService.syncAllMiningAccounts(adminId),
trading: await this.initService.syncAllTradingAccounts(adminId),
};
return {
success: true,
message: '全部同步完成',
details: results,
};
}
}

View File

@ -0,0 +1,116 @@
import {
Controller,
Get,
Post,
Body,
Query,
Param,
HttpException,
HttpStatus,
Req,
} from '@nestjs/common';
import {
ApiTags,
ApiOperation,
ApiBearerAuth,
ApiBody,
ApiQuery,
ApiParam,
} from '@nestjs/swagger';
import { ManualMiningService } from '../../application/services/manual-mining.service';
@ApiTags('Manual Mining')
@ApiBearerAuth()
@Controller('manual-mining')
export class ManualMiningController {
constructor(private readonly manualMiningService: ManualMiningService) {}
@Post('calculate')
@ApiOperation({ summary: '计算手工补发挖矿预估金额' })
@ApiBody({
schema: {
type: 'object',
required: ['accountSequence', 'adoptionDate'],
properties: {
accountSequence: { type: 'string', description: '用户账户序列号' },
adoptionDate: {
type: 'string',
format: 'date',
description: '认种日期 (YYYY-MM-DD)',
},
},
},
})
async calculate(
@Body() body: { accountSequence: string; adoptionDate: string },
) {
if (!body.accountSequence || !body.adoptionDate) {
throw new HttpException('账户序列号和认种日期不能为空', HttpStatus.BAD_REQUEST);
}
return this.manualMiningService.calculate(body);
}
@Post('execute')
@ApiOperation({ summary: '执行手工补发挖矿(仅超级管理员)' })
@ApiBody({
schema: {
type: 'object',
required: ['accountSequence', 'adoptionDate', 'reason'],
properties: {
accountSequence: { type: 'string', description: '用户账户序列号' },
adoptionDate: {
type: 'string',
format: 'date',
description: '认种日期 (YYYY-MM-DD)',
},
reason: { type: 'string', description: '补发原因(必填)' },
},
},
})
async execute(
@Body() body: { accountSequence: string; adoptionDate: string; reason: string },
@Req() req: any,
) {
if (!body.accountSequence || !body.adoptionDate) {
throw new HttpException('账户序列号和认种日期不能为空', HttpStatus.BAD_REQUEST);
}
if (!body.reason || body.reason.trim().length === 0) {
throw new HttpException('补发原因不能为空', HttpStatus.BAD_REQUEST);
}
const admin = req.admin;
return this.manualMiningService.execute(
{
accountSequence: body.accountSequence,
adoptionDate: body.adoptionDate,
operatorId: admin.id,
operatorName: admin.username,
reason: body.reason,
},
admin.id,
);
}
@Get('records')
@ApiOperation({ summary: '获取手工补发记录列表' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getRecords(
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.manualMiningService.getRecords(page ?? 1, pageSize ?? 20);
}
@Get('records/:accountSequence')
@ApiOperation({ summary: '查询指定用户的手工补发记录' })
@ApiParam({ name: 'accountSequence', type: String })
async getRecordByAccount(@Param('accountSequence') accountSequence: string) {
const record =
await this.manualMiningService.getRecordByAccountSequence(accountSequence);
if (!record) {
throw new HttpException('该用户没有手工补发记录', HttpStatus.NOT_FOUND);
}
return record;
}
}
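The `execute` endpoint above rejects empty fields before delegating to the service. The same pre-checks can be sketched as a standalone function, assuming nothing beyond what the controller shows (`validateExecuteBody` is an illustrative name):

```typescript
// Mirrors the manual validation in ManualMiningController.execute:
// accountSequence, adoptionDate and a non-blank reason are all required.
interface ExecuteBody {
  accountSequence?: string;
  adoptionDate?: string;
  reason?: string;
}

// Returns the error message to raise, or null when the body is valid.
function validateExecuteBody(body: ExecuteBody): string | null {
  if (!body.accountSequence || !body.adoptionDate) {
    return '账户序列号和认种日期不能为空';
  }
  if (!body.reason || body.reason.trim().length === 0) {
    return '补发原因不能为空';
  }
  return null;
}
```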

View File

@ -0,0 +1,77 @@
import { Controller, Get, Param, Query } from '@nestjs/common';
import {
ApiTags,
ApiOperation,
ApiBearerAuth,
ApiParam,
ApiQuery,
} from '@nestjs/swagger';
import { PendingContributionsService } from '../../application/services/pending-contributions.service';
@ApiTags('Pending Contributions')
@ApiBearerAuth()
@Controller('pending-contributions')
export class PendingContributionsController {
constructor(
private readonly pendingContributionsService: PendingContributionsService,
) {}
@Get()
@ApiOperation({ summary: '获取待解锁算力列表' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
@ApiQuery({
name: 'contributionType',
required: false,
type: String,
description: '算力类型筛选',
})
async getPendingContributions(
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
@Query('contributionType') contributionType?: string,
) {
return this.pendingContributionsService.getPendingContributions(
page ?? 1,
pageSize ?? 20,
contributionType,
);
}
@Get('summary')
@ApiOperation({ summary: '获取待解锁算力汇总统计' })
async getPendingContributionsSummary() {
return this.pendingContributionsService.getPendingContributionsSummary();
}
@Get('mining-records')
@ApiOperation({ summary: '获取所有待解锁算力的挖矿记录' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getAllPendingMiningRecords(
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.pendingContributionsService.getAllPendingMiningRecords(
page ?? 1,
pageSize ?? 20,
);
}
@Get(':id/records')
@ApiOperation({ summary: '获取某条待解锁算力的挖矿记录' })
@ApiParam({ name: 'id', type: String, description: '待解锁算力ID' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getPendingContributionMiningRecords(
@Param('id') id: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.pendingContributionsService.getPendingContributionMiningRecords(
id,
page ?? 1,
pageSize ?? 20,
);
}
}

View File

@ -0,0 +1,59 @@
import { Controller, Get, Query } from '@nestjs/common';
import {
ApiTags,
ApiOperation,
ApiBearerAuth,
ApiQuery,
} from '@nestjs/swagger';
import { DashboardService } from '../../application/services/dashboard.service';
@ApiTags('Reports')
@ApiBearerAuth()
@Controller('reports')
export class ReportsController {
constructor(private readonly dashboardService: DashboardService) {}
@Get('daily')
@ApiOperation({ summary: '获取每日报表' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
@ApiQuery({ name: 'days', required: false, type: Number })
async getDailyReports(
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
@Query('days') days?: number,
) {
const result = await this.dashboardService.getReports(
page ?? 1,
pageSize ?? 30,
);
// 转换为前端期望的格式
return {
items: result.data.map((report: any) => ({
id: report.id,
reportDate: report.reportDate,
totalUsers: report.users?.total || 0,
newUsers: report.users?.new || 0,
adoptedUsers: report.adoptions?.total || 0,
newAdoptedUsers: report.adoptions?.new || 0,
totalContribution: report.contribution?.total || '0',
newContribution: report.contribution?.growth || '0',
totalDistributed: report.mining?.distributed || '0',
dailyDistributed: report.mining?.distributed || '0',
totalBurned: report.mining?.burned || '0',
dailyBurned: report.mining?.burned || '0',
openPrice: report.price?.open || '1',
closePrice: report.price?.close || '1',
highPrice: report.price?.high || '1',
lowPrice: report.price?.low || '1',
totalVolume: report.trading?.volume || '0',
dailyVolume: report.trading?.volume || '0',
})),
total: result.total,
page: result.pagination.page,
pageSize: result.pagination.pageSize,
totalPages: result.pagination.totalPages,
};
}
}

View File

@ -1,5 +1,5 @@
import { Controller, Get, Param, Query } from '@nestjs/common';
import { ApiTags, ApiOperation, ApiBearerAuth, ApiParam, ApiQuery } from '@nestjs/swagger';
import { SystemAccountsService } from '../../application/services/system-accounts.service';
@ApiTags('System Accounts')
@ -19,4 +19,89 @@ export class SystemAccountsController {
async getSystemAccountsSummary() {
return this.systemAccountsService.getSystemAccountsSummary();
}
@Get(':accountType/records')
@ApiOperation({ summary: '获取系统账户挖矿记录' })
@ApiParam({ name: 'accountType', type: String, description: '系统账户类型:OPERATION/PROVINCE/CITY/HEADQUARTERS' })
@ApiQuery({ name: 'regionCode', required: false, type: String, description: '区域代码(省/市代码)' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getSystemAccountMiningRecords(
@Param('accountType') accountType: string,
@Query('regionCode') regionCode?: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.systemAccountsService.getSystemAccountMiningRecords(
accountType,
regionCode || null,
page ?? 1,
pageSize ?? 20,
);
}
@Get(':accountType/transactions')
@ApiOperation({ summary: '获取系统账户交易记录' })
@ApiParam({ name: 'accountType', type: String, description: '系统账户类型:OPERATION/PROVINCE/CITY/HEADQUARTERS' })
@ApiQuery({ name: 'regionCode', required: false, type: String, description: '区域代码(省/市代码)' })
@ApiQuery({ name: 'page', required: false, type: Number })
@ApiQuery({ name: 'pageSize', required: false, type: Number })
async getSystemAccountTransactions(
@Param('accountType') accountType: string,
@Query('regionCode') regionCode?: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.systemAccountsService.getSystemAccountTransactions(
accountType,
regionCode || null,
page ?? 1,
pageSize ?? 20,
);
}
@Get(':accountType/contributions')
@ApiOperation({
summary: '获取系统账户算力来源明细',
description: '显示该账户的每笔算力来自哪个认种订单',
})
@ApiParam({
name: 'accountType',
type: String,
description: '系统账户类型:OPERATION/PROVINCE/CITY/HEADQUARTERS',
})
@ApiQuery({ name: 'regionCode', required: false, type: String, description: '区域代码(省/市代码)' })
@ApiQuery({ name: 'page', required: false, type: Number, description: '页码(默认1)' })
@ApiQuery({ name: 'pageSize', required: false, type: Number, description: '每页数量(默认20)' })
async getSystemAccountContributionRecords(
@Param('accountType') accountType: string,
@Query('regionCode') regionCode?: string,
@Query('page') page?: number,
@Query('pageSize') pageSize?: number,
) {
return this.systemAccountsService.getSystemAccountContributionRecords(
accountType,
regionCode || null,
page ?? 1,
pageSize ?? 20,
);
}
@Get(':accountType/contribution-stats')
@ApiOperation({
summary: '获取系统账户算力明细统计',
description: '显示算力来源的汇总信息,包括记录数、来源认种订单数、来源用户数等',
})
@ApiParam({
name: 'accountType',
type: String,
description: '系统账户类型:OPERATION/PROVINCE/CITY/HEADQUARTERS',
})
@ApiQuery({ name: 'regionCode', required: false, type: String, description: '区域代码(省/市代码)' })
async getSystemAccountContributionStats(
@Param('accountType') accountType: string,
@Query('regionCode') regionCode?: string,
) {
return this.systemAccountsService.getSystemAccountContributionStats(accountType, regionCode || null);
}
}

View File

@ -12,7 +12,11 @@ import { AdminAuthGuard } from './shared/guards/admin-auth.guard';
imports: [
ConfigModule.forRoot({
isGlobal: true,
envFilePath: [
`.env.${process.env.NODE_ENV || 'development'}`,
'.env',
'../.env', // 父目录共享 .env
],
}),
InfrastructureModule,
ApplicationModule,

View File

@ -2,28 +2,37 @@ import { Module, OnModuleInit } from '@nestjs/common';
import { InfrastructureModule } from '../infrastructure/infrastructure.module';
import { AuthService } from './services/auth.service';
import { ConfigManagementService } from './services/config.service';
import { DashboardService } from './services/dashboard.service';
import { UsersService } from './services/users.service';
import { SystemAccountsService } from './services/system-accounts.service';
import { DailyReportService } from './services/daily-report.service';
import { ManualMiningService } from './services/manual-mining.service';
import { PendingContributionsService } from './services/pending-contributions.service';
import { BatchMiningService } from './services/batch-mining.service';
@Module({
imports: [InfrastructureModule],
providers: [
AuthService,
ConfigManagementService,
DashboardService,
UsersService,
SystemAccountsService,
DailyReportService,
ManualMiningService,
PendingContributionsService,
BatchMiningService,
],
exports: [
AuthService,
ConfigManagementService,
DashboardService,
UsersService,
SystemAccountsService,
DailyReportService,
ManualMiningService,
PendingContributionsService,
BatchMiningService,
],
})
export class ApplicationModule implements OnModuleInit {

View File

@ -0,0 +1,377 @@
import { Injectable, Logger, HttpException, HttpStatus } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
/**
 * Excel 批量补发数据行
*/
export interface BatchMiningItem {
accountSequence: string; // 注册ID (用户账号序列号)
treeCount: number; // 认种量(棵)
miningStartDate: string; // 挖矿开始时间
batch: number; // 批次号
preMineDays: number; // 授权提前挖的天数(该批次比后续批次提前的天数)
totalMiningDays: number; // 总挖矿天数(从挖矿开始日期到今天)
remark?: string; // 备注
}
/**
 * 批量补发请求
*/
export interface BatchMiningRequest {
items: BatchMiningItem[];
operatorId: string;
operatorName: string;
reason: string;
}
/**
 * 批量补发服务 - 调用 mining-service 的批量补发 API
*/
@Injectable()
export class BatchMiningService {
private readonly logger = new Logger(BatchMiningService.name);
private readonly miningServiceUrl: string;
constructor(
private readonly prisma: PrismaService,
private readonly configService: ConfigService,
) {
this.miningServiceUrl = this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
}
/**
 * 获取批量补发状态
*/
async getStatus(): Promise<any> {
const url = `${this.miningServiceUrl}/api/v2/admin/batch-mining/status`;
this.logger.log(`[getStatus] 开始获取批量补发状态, URL: ${url}`);
try {
this.logger.log(`[getStatus] 发送 GET 请求...`);
const response = await fetch(url, {
method: 'GET',
headers: { 'Content-Type': 'application/json' },
});
this.logger.log(`[getStatus] 响应状态码: ${response.status}`);
const result = await response.json();
this.logger.log(`[getStatus] 响应数据: ${JSON.stringify(result)}`);
if (!response.ok) {
this.logger.error(`[getStatus] 请求失败: ${result.message || '未知错误'}`);
throw new HttpException(
result.message || '获取状态失败',
response.status,
);
}
// mining-service 使用 TransformInterceptor 包装响应为 { success, data, timestamp }
const data = result.data || result;
this.logger.log(`[getStatus] 成功获取状态: hasExecuted=${data.hasExecuted}`);
return data;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[getStatus] 调用 mining-service 失败:`, error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
 * 预览批量补发(不执行)
*/
async preview(items: BatchMiningItem[]): Promise<any> {
const url = `${this.miningServiceUrl}/api/v2/admin/batch-mining/preview`;
this.logger.log(`[preview] 开始预览批量补发, URL: ${url}`);
this.logger.log(`[preview] 数据条数: ${items.length}`);
this.logger.log(`[preview] 前3条数据: ${JSON.stringify(items.slice(0, 3))}`);
try {
this.logger.log(`[preview] 发送 POST 请求...`);
const response = await fetch(url, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ items }),
});
this.logger.log(`[preview] 响应状态码: ${response.status}`);
const result = await response.json();
if (!response.ok) {
this.logger.error(`[preview] 请求失败: ${result.message || '未知错误'}`);
throw new HttpException(
result.message || '预览失败',
response.status,
);
}
// mining-service 使用 TransformInterceptor 包装响应为 { success, data, timestamp }
const data = result.data || result;
this.logger.log(`[preview] 响应数据概要: totalBatches=${data.totalBatches}, totalUsers=${data.totalUsers}, grandTotalAmount=${data.grandTotalAmount}`);
this.logger.log(`[preview] 预览成功`);
return data;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[preview] 调用 mining-service 失败:`, error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
 * 执行批量补发并记录审计日志
*/
async execute(
request: BatchMiningRequest,
adminId: string,
): Promise<any> {
const url = `${this.miningServiceUrl}/api/v2/admin/batch-mining/execute`;
this.logger.log(`[execute] 开始执行批量补发, URL: ${url}`);
this.logger.log(`[execute] 操作人: ${request.operatorName} (${request.operatorId})`);
this.logger.log(`[execute] 原因: ${request.reason}`);
this.logger.log(`[execute] 数据条数: ${request.items.length}`);
try {
this.logger.log(`[execute] 发送 POST 请求...`);
const response = await fetch(url, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(request),
});
this.logger.log(`[execute] 响应状态码: ${response.status}`);
const result = await response.json();
this.logger.log(`[execute] 响应数据: ${JSON.stringify(result)}`);
if (!response.ok) {
this.logger.error(`[execute] 请求失败: ${result.message || '未知错误'}`);
throw new HttpException(
result.message || '执行失败',
response.status,
);
}
// mining-service 使用 TransformInterceptor 包装响应为 { success, data, timestamp }
const data = result.data || result;
// 记录审计日志
this.logger.log(`[execute] 记录审计日志...`);
await this.prisma.auditLog.create({
data: {
adminId,
action: 'CREATE',
resource: 'BATCH_MINING',
resourceId: data.batchId,
newValue: {
totalUsers: data.totalUsers,
successCount: data.successCount,
failedCount: data.failedCount,
totalAmount: data.totalAmount,
reason: request.reason,
},
},
});
this.logger.log(
`[execute] 批量补发执行成功: admin=${adminId}, total=${data.totalUsers}, success=${data.successCount}, amount=${data.totalAmount}`,
);
return data;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[execute] 调用 mining-service 失败:`, error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
 * 获取批量补发执行记录
*/
async getExecution(): Promise<any> {
const url = `${this.miningServiceUrl}/api/v2/admin/batch-mining/execution`;
this.logger.log(`[getExecution] 开始获取执行记录, URL: ${url}`);
try {
this.logger.log(`[getExecution] 发送 GET 请求...`);
const response = await fetch(url, {
method: 'GET',
headers: { 'Content-Type': 'application/json' },
});
this.logger.log(`[getExecution] 响应状态码: ${response.status}`);
if (response.status === 404) {
this.logger.log(`[getExecution] 未找到执行记录 (404)`);
return null;
}
const result = await response.json();
if (!response.ok) {
this.logger.error(`[getExecution] 请求失败: ${result.message || '未知错误'}`);
throw new HttpException(
result.message || '获取记录失败',
response.status,
);
}
// mining-service 使用 TransformInterceptor 包装响应为 { success, data, timestamp }
const data = result.data || result;
this.logger.log(`[getExecution] 响应数据概要: id=${data.id}, totalUsers=${data.totalUsers}`);
this.logger.log(`[getExecution] 成功获取执行记录`);
return data;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error(`[getExecution] 调用 mining-service 失败:`, error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
 * 解析 Excel 数据
 * Excel 格式:序号 | 注册ID | 认种量(棵)| 挖矿开始时间 | 批次 | 授权提前挖的天数 | 备注
*/
parseExcelData(rows: any[]): BatchMiningItem[] {
this.logger.log(`[parseExcelData] 开始解析 Excel 数据, 总行数: ${rows.length}`);
const items: BatchMiningItem[] = [];
const today = new Date();
today.setHours(0, 0, 0, 0);
// 打印前5行原始数据用于调试
this.logger.log(`[parseExcelData] 前5行原始数据:`);
for (let i = 0; i < Math.min(5, rows.length); i++) {
this.logger.log(`[parseExcelData] 行${i}: ${JSON.stringify(rows[i])}`);
}
for (let i = 0; i < rows.length; i++) {
const row = rows[i];
// 跳过空行
if (!row || !row[0]) {
this.logger.debug(`[parseExcelData] 跳过行 ${i + 1}: 空行`);
continue;
}
// 跳过标题行
const firstCell = String(row[0]).trim();
if (firstCell === '用户ID' || firstCell === '注册ID' || firstCell === '序号') {
this.logger.debug(`[parseExcelData] 跳过行 ${i + 1}: 标题行`);
continue;
}
// Excel 格式:序号 | 注册ID | 认种量(棵)| 挖矿开始时间 | 批次 | 授权提前挖的天数 | 备注
// 索引: 0 1 2 3 4 5 6
// 获取用户ID (第二列索引1)
let accountSequence = String(row[1]).trim();
if (!accountSequence.startsWith('D')) {
accountSequence = 'D' + accountSequence;
}
// 获取认种量 (第三列索引2)
const treeCount = parseInt(row[2], 10);
if (isNaN(treeCount) || treeCount <= 0) {
this.logger.debug(`[parseExcelData] 跳过行 ${i + 1}: 认种量无效 (${row[2]})`);
continue;
}
// 获取挖矿开始时间 (第四列索引3)
const miningStartDateStr = String(row[3] || '').trim();
// 解析挖矿开始时间,计算总挖矿天数
const miningStartDate = this.parseDate(miningStartDateStr);
let totalMiningDays = 0;
if (miningStartDate) {
const diffTime = today.getTime() - miningStartDate.getTime();
totalMiningDays = Math.floor(diffTime / (1000 * 60 * 60 * 24));
}
// 获取批次 (第五列索引4)
const batch = parseInt(row[4], 10);
if (isNaN(batch) || batch <= 0) {
this.logger.warn(`[parseExcelData] 跳过行 ${i + 1}: 批次无效 (${row[4]})`);
continue;
}
// 获取授权提前挖的天数 (第六列索引5)
const preMineDays = parseInt(row[5], 10);
if (isNaN(preMineDays) || preMineDays <= 0) {
this.logger.warn(`[parseExcelData] 跳过行 ${i + 1}: 授权提前挖的天数无效 (${row[5]})`);
continue;
}
// 获取备注 (第七列索引6)
const remark = row[6] ? String(row[6]).trim() : undefined;
items.push({
accountSequence,
treeCount,
miningStartDate: miningStartDateStr,
batch,
preMineDays,
totalMiningDays,
remark,
});
}
this.logger.log(`[parseExcelData] 解析完成, 有效数据: ${items.length}`);
if (items.length > 0) {
this.logger.log(`[parseExcelData] 第一条数据: ${JSON.stringify(items[0])}`);
this.logger.log(`[parseExcelData] 最后一条数据: ${JSON.stringify(items[items.length - 1])}`);
}
return items;
}
/**
 * 解析日期字符串
* 支持格式: 2025.11.8, 2025-11-08, 2025/11/8
*/
private parseDate(dateStr: string): Date | null {
if (!dateStr) return null;
const formats = [
/^(\d{4})\.(\d{1,2})\.(\d{1,2})$/, // 2025.11.8
/^(\d{4})-(\d{1,2})-(\d{1,2})$/, // 2025-11-08
/^(\d{4})\/(\d{1,2})\/(\d{1,2})$/, // 2025/11/8
];
for (const format of formats) {
const match = dateStr.match(format);
if (match) {
const year = parseInt(match[1], 10);
const month = parseInt(match[2], 10) - 1;
const day = parseInt(match[3], 10);
const date = new Date(year, month, day);
date.setHours(0, 0, 0, 0);
return date;
}
}
return null;
}
}
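Every proxy method in `BatchMiningService` unwraps responses with `result.data || result`, because mining-service's TransformInterceptor wraps payloads as `{ success, data, timestamp }`. A minimal sketch of that unwrapping (`unwrapEnvelope` is an illustrative name, not service code):

```typescript
// mining-service responses arrive as { success, data, timestamp };
// unwrapped payloads pass through unchanged, matching `result.data || result`.
function unwrapEnvelope(result: any): any {
  return result.data || result;
}
```

Note the falsy-check semantics: a legitimate payload of `data: 0` or `data: ''` would fall through to the envelope itself, which is acceptable here because the proxied endpoints return objects.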

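The `parseDate` / `totalMiningDays` logic above can be sketched in isolation. This version fixes "today" as a parameter so the day count is deterministic, and collapses the service's three per-format regexes into one pattern (which, unlike the original, would also accept mixed separators such as `2025.11-8`):

```typescript
// Accepts the three date formats the service parses: 2025.11.8, 2025-11-08, 2025/11/8.
function parseDate(dateStr: string): Date | null {
  const m = dateStr.trim().match(/^(\d{4})[.\-\/](\d{1,2})[.\-\/](\d{1,2})$/);
  if (!m) return null;
  const date = new Date(
    parseInt(m[1], 10),
    parseInt(m[2], 10) - 1, // JS months are 0-based
    parseInt(m[3], 10),
  );
  date.setHours(0, 0, 0, 0);
  return date;
}

// Whole days from the mining start date to "today", floored; 0 if unparseable.
function totalMiningDays(startStr: string, today: Date): number {
  const start = parseDate(startStr);
  if (!start) return 0;
  return Math.floor((today.getTime() - start.getTime()) / (1000 * 60 * 60 * 24));
}
```

With the figures from the commit message (start 2025.11.8, today 2026-01-21) this yields 74 days.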
View File

@ -0,0 +1,264 @@
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import Decimal from 'decimal.js';
@Injectable()
export class DailyReportService implements OnModuleInit {
private readonly logger = new Logger(DailyReportService.name);
private reportInterval: NodeJS.Timeout | null = null;
constructor(private readonly prisma: PrismaService) {}
async onModuleInit() {
// 启动时先生成一次报表
await this.generateTodayReport();
// 每小时检查并更新当日报表
this.reportInterval = setInterval(
() => this.generateTodayReport(),
60 * 60 * 1000, // 1 hour
);
this.logger.log('Daily report service initialized');
}
/**
 * 生成/更新当日报表
*/
async generateTodayReport(): Promise<void> {
const today = new Date();
today.setHours(0, 0, 0, 0);
try {
this.logger.log(`Generating daily report for ${today.toISOString().split('T')[0]}`);
// 收集各项统计数据
const [
userStats,
adoptionStats,
contributionStats,
miningStats,
tradingStats,
priceStats,
] = await Promise.all([
this.getUserStats(today),
this.getAdoptionStats(today),
this.getContributionStats(today),
this.getMiningStats(),
this.getTradingStats(today),
this.getPriceStats(today),
]);
// 更新或创建今日报表
await this.prisma.dailyReport.upsert({
where: { reportDate: today },
create: {
reportDate: today,
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
update: {
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
});
this.logger.log(`Daily report generated successfully for ${today.toISOString().split('T')[0]}`);
} catch (error) {
this.logger.error('Failed to generate daily report', error);
}
}
/**
 * 生成指定历史日期的报表
*/
async generateHistoricalReport(date: Date): Promise<void> {
const reportDate = new Date(date);
reportDate.setHours(0, 0, 0, 0);
const [
userStats,
adoptionStats,
contributionStats,
miningStats,
tradingStats,
priceStats,
] = await Promise.all([
this.getUserStats(reportDate),
this.getAdoptionStats(reportDate),
this.getContributionStats(reportDate),
this.getMiningStats(),
this.getTradingStats(reportDate),
this.getPriceStats(reportDate),
]);
await this.prisma.dailyReport.upsert({
where: { reportDate },
create: {
reportDate,
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
update: {
...userStats,
...adoptionStats,
...contributionStats,
...miningStats,
...tradingStats,
...priceStats,
},
});
}
/**
 * 用户统计
*/
private async getUserStats(date: Date) {
const nextDay = new Date(date);
nextDay.setDate(nextDay.getDate() + 1);
const [totalUsers, newUsers] = await Promise.all([
this.prisma.syncedUser.count({
where: { createdAt: { lt: nextDay } },
}),
this.prisma.syncedUser.count({
where: {
createdAt: { gte: date, lt: nextDay },
},
}),
]);
// 活跃用户暂时用总用户数(需要有活跃度跟踪才能准确计算)
const activeUsers = totalUsers;
return {
totalUsers,
newUsers,
activeUsers,
};
}
/**
 * 认种统计
*/
private async getAdoptionStats(date: Date) {
const nextDay = new Date(date);
nextDay.setDate(nextDay.getDate() + 1);
const [totalAdoptions, newAdoptions, treesResult] = await Promise.all([
this.prisma.syncedAdoption.count({
where: { adoptionDate: { lt: nextDay } },
}),
this.prisma.syncedAdoption.count({
where: {
adoptionDate: { gte: date, lt: nextDay },
},
}),
this.prisma.syncedAdoption.aggregate({
where: { adoptionDate: { lt: nextDay } },
_sum: { treeCount: true },
}),
]);
return {
totalAdoptions,
newAdoptions,
totalTrees: treesResult._sum.treeCount || 0,
};
}
/**
* Contribution statistics with day-over-day growth
*/
private async getContributionStats(date: Date) {
// Fetch network-wide contribution progress
const networkProgress = await this.prisma.syncedNetworkProgress.findFirst();
// Aggregate user contribution totals
const userContribution = await this.prisma.syncedContributionAccount.aggregate({
_sum: {
totalContribution: true,
effectiveContribution: true,
},
});
const totalContribution = new Decimal(
userContribution._sum.totalContribution?.toString() || '0',
);
// Fetch yesterday's report to compute growth
const yesterday = new Date(date);
yesterday.setDate(yesterday.getDate() - 1);
const yesterdayReport = await this.prisma.dailyReport.findUnique({
where: { reportDate: yesterday },
});
const contributionGrowth = yesterdayReport
? totalContribution.minus(new Decimal(yesterdayReport.totalContribution.toString()))
: totalContribution;
return {
totalContribution,
contributionGrowth: contributionGrowth.gt(0) ? contributionGrowth : new Decimal(0),
};
}
/**
* Mining statistics from the latest daily stat
*/
private async getMiningStats() {
const dailyStat = await this.prisma.syncedDailyMiningStat.findFirst({
orderBy: { statDate: 'desc' },
});
return {
totalDistributed: dailyStat?.totalDistributed || new Decimal(0),
totalBurned: dailyStat?.totalBurned || new Decimal(0),
};
}
/**
* Trading statistics from the day K-line
*/
private async getTradingStats(date: Date) {
const kline = await this.prisma.syncedDayKLine.findUnique({
where: { klineDate: date },
});
return {
tradingVolume: kline?.volume || new Decimal(0),
tradingAmount: kline?.amount || new Decimal(0),
tradeCount: kline?.tradeCount || 0,
};
}
/**
* Price statistics from the day K-line
*/
private async getPriceStats(date: Date) {
const kline = await this.prisma.syncedDayKLine.findUnique({
where: { klineDate: date },
});
const defaultPrice = new Decimal(1);
return {
openPrice: kline?.open || defaultPrice,
closePrice: kline?.close || defaultPrice,
highPrice: kline?.high || defaultPrice,
lowPrice: kline?.low || defaultPrice,
};
}
}
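The stats helpers above all count rows in the same half-open day window: a cumulative total uses only `createdAt < nextDay`, while a "new that day" count uses `createdAt >= date && createdAt < nextDay`. A minimal standalone sketch of that windowing (the helper names `dayWindow`, `isBefore`, and `isWithin` are illustrative, not part of the service):

```typescript
// Build the half-open [dayStart, nextDay) window used by the stats queries.
function dayWindow(date: Date): { gte: Date; lt: Date } {
  const start = new Date(date);
  start.setHours(0, 0, 0, 0);
  const end = new Date(start);
  end.setDate(end.getDate() + 1);
  return { gte: start, lt: end };
}

// Cumulative counts use only the upper bound: createdAt < nextDay.
function isBefore(createdAt: Date, window: { lt: Date }): boolean {
  return createdAt.getTime() < window.lt.getTime();
}

// New-on-that-day counts use both bounds: date <= createdAt < nextDay.
function isWithin(createdAt: Date, window: { gte: Date; lt: Date }): boolean {
  const t = createdAt.getTime();
  return t >= window.gte.getTime() && t < window.lt.getTime();
}
```

The half-open upper bound avoids double counting: a record created exactly at midnight belongs to the next day's window, never to both.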

View File

@ -1,10 +1,30 @@
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { Decimal } from 'decimal.js';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
// Base contribution (hash-rate) constants
const BASE_CONTRIBUTION_PER_TREE = new Decimal('22617');
const RATE_PERSONAL = new Decimal('0.70');
const RATE_OPERATION = new Decimal('0.12');
const RATE_PROVINCE = new Decimal('0.01');
const RATE_CITY = new Decimal('0.02');
const RATE_LEVEL_TOTAL = new Decimal('0.075');
const RATE_BONUS_TOTAL = new Decimal('0.075');
// Cached data from remote services
interface RemoteServiceData {
totalDistributed: string;
totalBurned: string;
circulationPool: string;
fetchedAt: Date;
}
@Injectable()
export class DashboardService {
private readonly logger = new Logger(DashboardService.name);
private remoteDataCache: RemoteServiceData | null = null;
private readonly CACHE_TTL_MS = 30000; // 30-second cache
constructor(
private readonly prisma: PrismaService,
@ -23,6 +43,8 @@ export class DashboardService {
tradingStats,
latestReport,
latestKLine,
detailedContributionStats,
remoteData,
] = await Promise.all([
this.getUserStats(),
this.getContributionStats(),
@ -30,13 +52,42 @@ export class DashboardService {
this.getTradingStats(),
this.prisma.dailyReport.findFirst({ orderBy: { reportDate: 'desc' } }),
this.prisma.syncedDayKLine.findFirst({ orderBy: { klineDate: 'desc' } }),
this.getDetailedContributionStats(),
this.fetchRemoteServiceData(),
]);
// Merge remote service data: use remote values when local data is empty or zero
const totalMined = miningStats.totalMined !== '0'
? miningStats.totalMined
: remoteData.totalDistributed;
const totalBurned = miningStats.latestDailyStat?.totalBurned || remoteData.totalBurned;
const circulationPoolShares = tradingStats.circulationPool?.totalShares !== '0'
? tradingStats.circulationPool?.totalShares
: remoteData.circulationPool;
return {
users: userStats,
contribution: contributionStats,
mining: {
...miningStats,
totalMined, // merged distributed total
},
trading: {
...tradingStats,
circulationPool: {
totalShares: circulationPoolShares || '0',
totalCash: tradingStats.circulationPool?.totalCash || '0',
},
},
// Expose remote data directly for dashboard display
remoteData: {
totalDistributed: remoteData.totalDistributed,
totalBurned: remoteData.totalBurned,
circulationPool: remoteData.circulationPool,
},
detailedContribution: detailedContributionStats,
latestReport: latestReport
? this.formatDailyReport(latestReport)
: null,
@ -110,39 +161,302 @@ export class DashboardService {
/**
* Contribution (hash-rate) statistics
*
* Total = personal (70%) + operation (12%) + province (1%) + city (2%)
* + level (7.5%) + bonus (7.5%) = 100%
* Effective contribution = theoretical network total
* = total trees * 22617
*/
private async getContributionStats() {
const [accounts, systemContributions, adoptionStats] = await Promise.all([
this.prisma.syncedContributionAccount.aggregate({
_sum: {
totalContribution: true,
effectiveContribution: true,
personalContribution: true,
teamLevelContribution: true,
teamBonusContribution: true,
},
_count: true,
}),
this.prisma.syncedSystemContribution.aggregate({
_sum: { contributionBalance: true },
_count: true,
}),
this.prisma.syncedAdoption.aggregate({
where: { status: 'MINING_ENABLED' },
_sum: { treeCount: true },
_count: true,
}),
]);
const totalTrees = adoptionStats._sum.treeCount || 0;
// Effective contribution = theoretical total = total trees * 22617,
// since the formula covers every share: personal 70% + operation 12% + province 1% + city 2% + level 7.5% + bonus 7.5% = 100%
const effectiveContribution = BASE_CONTRIBUTION_PER_TREE.mul(totalTrees);
// Personal contribution (already credited to user accounts)
const personalContribution = new Decimal(accounts._sum.personalContribution || 0);
// System-account contribution (operation + province + city)
const systemContribution = new Decimal(systemContributions._sum.contributionBalance || 0);
return {
totalAccounts: accounts._count,
totalContribution: accounts._sum.totalContribution?.toString() || '0',
effectiveContribution: effectiveContribution.toString(),
personalContribution: personalContribution.toString(),
teamLevelContribution:
accounts._sum.teamLevelContribution?.toString() || '0',
teamBonusContribution:
accounts._sum.teamBonusContribution?.toString() || '0',
systemAccounts: systemContributions._count,
systemContribution: systemContribution.toString(),
totalAdoptions: adoptionStats._count,
totalTrees,
};
}
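The 100% split behind the contribution statistics can be sanity-checked with plain numbers. A standalone sketch (the service itself uses decimal.js for exact arithmetic; the constants here mirror `BASE_CONTRIBUTION_PER_TREE` and the rate constants):

```typescript
// Per-tree base contribution and the full 100% distribution split.
const BASE_PER_TREE = 22617;
const RATES: Record<string, number> = {
  personal: 0.70,
  operation: 0.12,
  province: 0.01,
  city: 0.02,
  level: 0.075,
  bonus: 0.075,
};

// Theoretical network total for a tree count, split by ratio.
function splitContribution(totalTrees: number): { networkTotal: number; parts: Record<string, number> } {
  const networkTotal = BASE_PER_TREE * totalTrees;
  const parts: Record<string, number> = {};
  for (const [key, rate] of Object.entries(RATES)) {
    parts[key] = networkTotal * rate;
  }
  return { networkTotal, parts };
}
```

Because the six rates sum to exactly 1, the effective contribution equals the theoretical total (trees × 22617) with no remainder, which is why the dashboard derives it from the tree count rather than summing per-account balances.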
/**
* Detailed contribution statistics
* Prefers the contribution-service API, which has accurate pending data;
* falls back to locally synced data when the API is unavailable
*/
private async getDetailedContributionStats() {
// Try to fetch complete data from contribution-service
const contributionServiceData = await this.fetchContributionServiceStats();
if (contributionServiceData) {
return contributionServiceData;
}
// Fallback: compute from locally synced data
return this.getDetailedContributionStatsFromLocal();
}
/**
* Fetch statistics from the contribution-service API
*/
private async fetchContributionServiceStats(): Promise<any | null> {
const contributionServiceUrl = this.configService.get<string>(
'CONTRIBUTION_SERVICE_URL',
'http://localhost:3020',
);
try {
const response = await fetch(`${contributionServiceUrl}/api/v2/contribution/stats`);
if (!response.ok) {
this.logger.warn(`Contribution service returned ${response.status}`);
return null;
}
const result = await response.json();
const data = result.data || result;
// Fetch actual system-account balances (local data)
const systemAccounts = await this.prisma.syncedSystemContribution.findMany();
let operationActual = new Decimal(0);
let provinceActual = new Decimal(0);
let cityActual = new Decimal(0);
for (const account of systemAccounts) {
const balance = new Decimal(account.contributionBalance || 0);
if (account.accountType === 'OPERATION') operationActual = operationActual.plus(balance);
else if (account.accountType === 'PROVINCE') provinceActual = provinceActual.plus(balance);
else if (account.accountType === 'CITY') cityActual = cityActual.plus(balance);
}
return {
totalTrees: data.totalTrees || 0,
// Theoretical values
networkTotalTheory: data.networkTotalContribution || '0',
personalTheory: data.personalTotalContribution || '0',
operationTheory: data.operationTotalContribution || '0',
provinceTheory: data.provinceTotalContribution || '0',
cityTheory: data.cityTotalContribution || '0',
levelTheory: data.levelContribution?.total || '0',
bonusTheory: data.bonusContribution?.total || '0',
// Actual values
operationActual: operationActual.toString(),
provinceActual: provinceActual.toString(),
cityActual: cityActual.toString(),
// Level contribution details (includes accurate pending data)
levelContribution: {
total: data.levelContribution?.total || '0',
unlocked: data.levelContribution?.unlocked || '0',
pending: data.levelContribution?.pending || '0',
byTier: {
tier1: {
unlocked: data.levelContribution?.byTier?.tier1?.unlocked || '0',
pending: data.levelContribution?.byTier?.tier1?.pending || '0',
},
tier2: {
unlocked: data.levelContribution?.byTier?.tier2?.unlocked || '0',
pending: data.levelContribution?.byTier?.tier2?.pending || '0',
},
tier3: {
unlocked: data.levelContribution?.byTier?.tier3?.unlocked || '0',
pending: data.levelContribution?.byTier?.tier3?.pending || '0',
},
},
},
// Team bonus contribution details (includes accurate pending data)
bonusContribution: {
total: data.bonusContribution?.total || '0',
unlocked: data.bonusContribution?.unlocked || '0',
pending: data.bonusContribution?.pending || '0',
byTier: {
tier1: {
unlocked: data.bonusContribution?.byTier?.tier1?.unlocked || '0',
pending: data.bonusContribution?.byTier?.tier1?.pending || '0',
},
tier2: {
unlocked: data.bonusContribution?.byTier?.tier2?.unlocked || '0',
pending: data.bonusContribution?.byTier?.tier2?.pending || '0',
},
tier3: {
unlocked: data.bonusContribution?.byTier?.tier3?.unlocked || '0',
pending: data.bonusContribution?.byTier?.tier3?.pending || '0',
},
},
},
};
} catch (error) {
this.logger.warn(`Failed to fetch contribution service stats: ${error.message}`);
return null;
}
}
/**
* Fallback: compute detailed contribution statistics from locally synced data
*/
private async getDetailedContributionStatsFromLocal() {
// Fetch the total tree count
const adoptionStats = await this.prisma.syncedAdoption.aggregate({
where: { status: 'MINING_ENABLED' },
_sum: { treeCount: true },
});
const totalTrees = adoptionStats._sum.treeCount || 0;
// Aggregate distributed level contribution by level depth
const levelRecords = await this.prisma.syncedContributionRecord.groupBy({
by: ['levelDepth'],
where: {
sourceType: 'TEAM_LEVEL',
levelDepth: { not: null },
},
_sum: { amount: true },
});
// Aggregate distributed team bonus contribution by tier
const bonusRecords = await this.prisma.syncedContributionRecord.groupBy({
by: ['bonusTier'],
where: {
sourceType: 'TEAM_BONUS',
bonusTier: { not: null },
},
_sum: { amount: true },
});
// Fetch system-account contribution by account type
const systemAccounts = await this.prisma.syncedSystemContribution.findMany();
// Sum level depths 1-5, 6-10, 11-15
let levelTier1 = new Decimal(0);
let levelTier2 = new Decimal(0);
let levelTier3 = new Decimal(0);
for (const record of levelRecords) {
const depth = record.levelDepth!;
const amount = new Decimal(record._sum.amount || 0);
if (depth >= 1 && depth <= 5) levelTier1 = levelTier1.plus(amount);
else if (depth >= 6 && depth <= 10) levelTier2 = levelTier2.plus(amount);
else if (depth >= 11 && depth <= 15) levelTier3 = levelTier3.plus(amount);
}
// Sum team bonus tiers
let bonusTier1 = new Decimal(0);
let bonusTier2 = new Decimal(0);
let bonusTier3 = new Decimal(0);
for (const record of bonusRecords) {
const tier = record.bonusTier!;
const amount = new Decimal(record._sum.amount || 0);
if (tier === 1) bonusTier1 = amount;
else if (tier === 2) bonusTier2 = amount;
else if (tier === 3) bonusTier3 = amount;
}
const levelUnlocked = levelTier1.plus(levelTier2).plus(levelTier3);
const bonusUnlocked = bonusTier1.plus(bonusTier2).plus(bonusTier3);
// Compute theoretical values
const networkTotal = BASE_CONTRIBUTION_PER_TREE.mul(totalTrees);
const personalTheory = networkTotal.mul(RATE_PERSONAL);
const operationTheory = networkTotal.mul(RATE_OPERATION);
const provinceTheory = networkTotal.mul(RATE_PROVINCE);
const cityTheory = networkTotal.mul(RATE_CITY);
const levelTheory = networkTotal.mul(RATE_LEVEL_TOTAL);
const bonusTheory = networkTotal.mul(RATE_BONUS_TOTAL);
// Pending = theoretical - unlocked; totals only, per-tier pending is unavailable locally
const levelPending = levelTheory.minus(levelUnlocked).greaterThan(0)
? levelTheory.minus(levelUnlocked)
: new Decimal(0);
const bonusPending = bonusTheory.minus(bonusUnlocked).greaterThan(0)
? bonusTheory.minus(bonusUnlocked)
: new Decimal(0);
// Sum system accounts by type
let operationActual = new Decimal(0);
let provinceActual = new Decimal(0);
let cityActual = new Decimal(0);
for (const account of systemAccounts) {
const balance = new Decimal(account.contributionBalance || 0);
if (account.accountType === 'OPERATION') operationActual = operationActual.plus(balance);
else if (account.accountType === 'PROVINCE') provinceActual = provinceActual.plus(balance);
else if (account.accountType === 'CITY') cityActual = cityActual.plus(balance);
}
return {
totalTrees,
// Theoretical values (computed from the total tree count)
networkTotalTheory: networkTotal.toString(),
personalTheory: personalTheory.toString(),
operationTheory: operationTheory.toString(),
provinceTheory: provinceTheory.toString(),
cityTheory: cityTheory.toString(),
levelTheory: levelTheory.toString(),
bonusTheory: bonusTheory.toString(),
// Actual values (aggregated from the database)
operationActual: operationActual.toString(),
provinceActual: provinceActual.toString(),
cityActual: cityActual.toString(),
// Level contribution details (per-tier pending unavailable locally, shown as N/A)
levelContribution: {
total: levelTheory.toString(),
unlocked: levelUnlocked.toString(),
pending: levelPending.toString(),
byTier: {
tier1: { unlocked: levelTier1.toString(), pending: 'N/A' },
tier2: { unlocked: levelTier2.toString(), pending: 'N/A' },
tier3: { unlocked: levelTier3.toString(), pending: 'N/A' },
},
},
// Team bonus contribution details (per-tier pending unavailable locally, shown as N/A)
bonusContribution: {
total: bonusTheory.toString(),
unlocked: bonusUnlocked.toString(),
pending: bonusPending.toString(),
byTier: {
tier1: { unlocked: bonusTier1.toString(), pending: 'N/A' },
tier2: { unlocked: bonusTier2.toString(), pending: 'N/A' },
tier3: { unlocked: bonusTier3.toString(), pending: 'N/A' },
},
},
};
}
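The local fallback buckets level depths 1-5, 6-10, and 11-15 into three tiers with an if/else chain. The same mapping as small pure functions (the names `tierForDepth` and `sumByTier` are illustrative, not part of the service):

```typescript
// Map a level depth (1..15) to its tier (1..3); null outside the range.
function tierForDepth(depth: number): 1 | 2 | 3 | null {
  if (depth >= 1 && depth <= 5) return 1;
  if (depth >= 6 && depth <= 10) return 2;
  if (depth >= 11 && depth <= 15) return 3;
  return null;
}

// Sum grouped amounts into per-tier buckets, mirroring the fallback logic.
function sumByTier(records: { levelDepth: number; amount: number }[]): Record<1 | 2 | 3, number> {
  const tiers = { 1: 0, 2: 0, 3: 0 } as Record<1 | 2 | 3, number>;
  for (const r of records) {
    const t = tierForDepth(r.levelDepth);
    if (t !== null) tiers[t] += r.amount;
  }
  return tiers;
}
```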
@ -295,6 +609,79 @@ export class DashboardService {
};
}
// ===========================================================================
// Remote service data fetching (real-time fallback)
// ===========================================================================
/**
* Fetch real-time data from mining-service and trading-service,
* used as a fallback when CDC-synced local data is empty
*/
private async fetchRemoteServiceData(): Promise<RemoteServiceData> {
// Check the cache
if (
this.remoteDataCache &&
Date.now() - this.remoteDataCache.fetchedAt.getTime() < this.CACHE_TTL_MS
) {
return this.remoteDataCache;
}
const miningServiceUrl = this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
const tradingServiceUrl = this.configService.get<string>(
'TRADING_SERVICE_URL',
'http://localhost:3022',
);
let totalDistributed = '0';
let totalBurned = '0';
let circulationPool = '0';
try {
// Fetch distributed shares from mining-service
const miningResponse = await fetch(
`${miningServiceUrl}/api/v2/admin/status`,
);
if (miningResponse.ok) {
const miningResult = await miningResponse.json();
const miningData = miningResult.data || miningResult;
// Use totalDistributed directly (the sum of all users' totalMined)
totalDistributed = miningData.totalDistributed || '0';
}
} catch (error) {
this.logger.warn(`Failed to fetch mining service data: ${error.message}`);
}
try {
// Fetch the market overview from trading-service (burn and circulation-pool data)
const marketResponse = await fetch(
`${tradingServiceUrl}/api/v2/asset/market`,
);
if (marketResponse.ok) {
const marketResult = await marketResponse.json();
const marketData = marketResult.data || marketResult;
// blackHoleAmount is the total burned amount
totalBurned = marketData.blackHoleAmount || '0';
// circulationPool is the circulation-pool balance
circulationPool = marketData.circulationPool || '0';
}
} catch (error) {
this.logger.warn(`Failed to fetch market overview: ${error.message}`);
}
// Update the cache
this.remoteDataCache = {
totalDistributed,
totalBurned,
circulationPool,
fetchedAt: new Date(),
};
return this.remoteDataCache;
}
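`fetchRemoteServiceData` keeps one cached snapshot guarded by a 30-second TTL: return the cached value while it is fresh, otherwise refetch and restamp. That pattern generalizes to a tiny reusable helper; a sketch under the same assumptions (not part of the service):

```typescript
// Minimal single-value TTL cache: re-runs `load` once the entry expires.
class TtlCache<T> {
  private value: T | null = null;
  private fetchedAt = 0;

  constructor(private readonly ttlMs: number) {}

  async get(load: () => Promise<T>): Promise<T> {
    // Serve the cached value while it is still within the TTL window.
    if (this.value !== null && Date.now() - this.fetchedAt < this.ttlMs) {
      return this.value;
    }
    this.value = await load();
    this.fetchedAt = Date.now();
    return this.value;
  }
}
```

Usage would look like `const cache = new TtlCache<RemoteServiceData>(30_000); const data = await cache.get(() => fetchAll());`, which keeps the TTL bookkeeping out of the fetching code.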
// ===========================================================================
// Helper methods
// ===========================================================================

View File

@ -1,304 +0,0 @@
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
@Injectable()
export class InitializationService {
private readonly logger = new Logger(InitializationService.name);
constructor(
private readonly prisma: PrismaService,
private readonly configService: ConfigService,
) {}
async initializeMiningConfig(
adminId: string,
config: {
totalShares: string;
distributionPool: string;
halvingPeriodYears: number;
burnTarget: string;
},
): Promise<{ success: boolean; message: string }> {
const record = await this.prisma.initializationRecord.create({
data: { type: 'MINING_CONFIG', status: 'PENDING', config, executedBy: adminId },
});
try {
const miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const response = await fetch(`${miningServiceUrl}/api/v1/admin/initialize`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(config),
});
if (!response.ok) {
throw new Error('Failed to initialize mining config');
}
await this.prisma.initializationRecord.update({
where: { id: record.id },
data: { status: 'COMPLETED', executedAt: new Date() },
});
await this.prisma.auditLog.create({
data: { adminId, action: 'INIT', resource: 'MINING', resourceId: record.id, newValue: config },
});
return { success: true, message: 'Mining config initialized successfully' };
} catch (error: any) {
await this.prisma.initializationRecord.update({
where: { id: record.id },
data: { status: 'FAILED', errorMessage: error.message },
});
return { success: false, message: error.message };
}
}
async initializeSystemAccounts(adminId: string): Promise<{ success: boolean; message: string }> {
const accounts = [
{ accountType: 'OPERATION', name: '运营账户', description: '12% 运营收入' },
{ accountType: 'PROVINCE', name: '省公司账户', description: '1% 省公司收入' },
{ accountType: 'CITY', name: '市公司账户', description: '2% 市公司收入' },
];
for (const account of accounts) {
await this.prisma.systemAccount.upsert({
where: { accountType: account.accountType },
create: account,
update: { name: account.name, description: account.description },
});
}
await this.prisma.auditLog.create({
data: { adminId, action: 'INIT', resource: 'SYSTEM_ACCOUNT', newValue: accounts },
});
return { success: true, message: 'System accounts initialized successfully' };
}
async activateMining(adminId: string): Promise<{ success: boolean; message: string }> {
try {
const miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const response = await fetch(`${miningServiceUrl}/api/v1/admin/activate`, { method: 'POST' });
if (!response.ok) {
throw new Error('Failed to activate mining');
}
await this.prisma.auditLog.create({
data: { adminId, action: 'INIT', resource: 'MINING', newValue: { action: 'ACTIVATE' } },
});
return { success: true, message: 'Mining activated successfully' };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllUsers(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const authServiceUrl = this.configService.get<string>('AUTH_SERVICE_URL', 'http://localhost:3024');
const response = await fetch(`${authServiceUrl}/api/v2/admin/users/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch users: ${response.statusText}`);
}
const responseData = await response.json();
const users = responseData.data?.users || responseData.users || [];
let syncedCount = 0;
for (const user of users) {
try {
await this.prisma.syncedUser.upsert({
where: { accountSequence: user.accountSequence },
create: {
originalUserId: user.id || user.accountSequence,
accountSequence: user.accountSequence,
phone: user.phone,
status: user.status || 'ACTIVE',
kycStatus: user.kycStatus || 'PENDING',
realName: user.realName || null,
isLegacyUser: user.isLegacyUser || false,
createdAt: new Date(user.createdAt),
},
update: {
phone: user.phone,
status: user.status || 'ACTIVE',
kycStatus: user.kycStatus || 'PENDING',
realName: user.realName || null,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync user ${user.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'USER', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} users`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllContributionAccounts(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const contributionServiceUrl = this.configService.get<string>('CONTRIBUTION_SERVICE_URL', 'http://localhost:3020');
const response = await fetch(`${contributionServiceUrl}/api/v2/admin/accounts/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch accounts: ${response.statusText}`);
}
const responseData = await response.json();
const accounts = responseData.data?.accounts || responseData.accounts || [];
let syncedCount = 0;
for (const account of accounts) {
try {
await this.prisma.syncedContributionAccount.upsert({
where: { accountSequence: account.accountSequence },
create: {
accountSequence: account.accountSequence,
personalContribution: account.personalContribution || 0,
teamLevelContribution: account.teamLevelContribution || 0,
teamBonusContribution: account.teamBonusContribution || 0,
totalContribution: account.totalContribution || 0,
effectiveContribution: account.effectiveContribution || 0,
hasAdopted: account.hasAdopted || false,
directReferralCount: account.directReferralAdoptedCount || 0,
unlockedLevelDepth: account.unlockedLevelDepth || 0,
unlockedBonusTiers: account.unlockedBonusTiers || 0,
},
update: {
personalContribution: account.personalContribution,
teamLevelContribution: account.teamLevelContribution,
teamBonusContribution: account.teamBonusContribution,
totalContribution: account.totalContribution,
effectiveContribution: account.effectiveContribution,
hasAdopted: account.hasAdopted,
directReferralCount: account.directReferralAdoptedCount,
unlockedLevelDepth: account.unlockedLevelDepth,
unlockedBonusTiers: account.unlockedBonusTiers,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync account ${account.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'CONTRIBUTION_ACCOUNT', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} accounts`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllMiningAccounts(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
const response = await fetch(`${miningServiceUrl}/api/v1/admin/accounts/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch accounts: ${response.statusText}`);
}
const responseData = await response.json();
const accounts = responseData.data?.accounts || responseData.accounts || [];
let syncedCount = 0;
for (const account of accounts) {
try {
await this.prisma.syncedMiningAccount.upsert({
where: { accountSequence: account.accountSequence },
create: {
accountSequence: account.accountSequence,
totalMined: account.totalMined || 0,
availableBalance: account.availableBalance || 0,
frozenBalance: account.frozenBalance || 0,
totalContribution: account.totalContribution || 0,
},
update: {
totalMined: account.totalMined,
availableBalance: account.availableBalance,
frozenBalance: account.frozenBalance,
totalContribution: account.totalContribution,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync mining account ${account.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'MINING_ACCOUNT', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} mining accounts`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
async syncAllTradingAccounts(adminId: string): Promise<{ success: boolean; message: string; syncedCount?: number }> {
try {
const tradingServiceUrl = this.configService.get<string>('TRADING_SERVICE_URL', 'http://localhost:3022');
const response = await fetch(`${tradingServiceUrl}/api/v1/admin/accounts/sync`);
if (!response.ok) {
throw new Error(`Failed to fetch accounts: ${response.statusText}`);
}
const responseData = await response.json();
const accounts = responseData.data?.accounts || responseData.accounts || [];
let syncedCount = 0;
for (const account of accounts) {
try {
await this.prisma.syncedTradingAccount.upsert({
where: { accountSequence: account.accountSequence },
create: {
accountSequence: account.accountSequence,
shareBalance: account.shareBalance || 0,
cashBalance: account.cashBalance || 0,
frozenShares: account.frozenShares || 0,
frozenCash: account.frozenCash || 0,
totalBought: account.totalBought || 0,
totalSold: account.totalSold || 0,
},
update: {
shareBalance: account.shareBalance,
cashBalance: account.cashBalance,
frozenShares: account.frozenShares,
frozenCash: account.frozenCash,
totalBought: account.totalBought,
totalSold: account.totalSold,
},
});
syncedCount++;
} catch (err) {
this.logger.warn(`Failed to sync trading account ${account.accountSequence}: ${err}`);
}
}
await this.prisma.auditLog.create({
data: { adminId, action: 'SYNC', resource: 'TRADING_ACCOUNT', newValue: { syncedCount } },
});
return { success: true, message: `Synced ${syncedCount} trading accounts`, syncedCount };
} catch (error: any) {
return { success: false, message: error.message };
}
}
}

View File

@ -0,0 +1,205 @@
import { Injectable, Logger, HttpException, HttpStatus } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
export interface ManualMiningCalculateRequest {
accountSequence: string;
adoptionDate: string;
}
export interface ManualMiningExecuteRequest {
accountSequence: string;
adoptionDate: string;
operatorId: string;
operatorName: string;
reason: string;
}
/**
* Manual mining service - proxies mining-service admin APIs
*/
@Injectable()
export class ManualMiningService {
private readonly logger = new Logger(ManualMiningService.name);
private readonly miningServiceUrl: string;
constructor(
private readonly prisma: PrismaService,
private readonly configService: ConfigService,
) {
this.miningServiceUrl = this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
}
/**
* Calculate the manual mining amount for an account
*/
async calculate(request: ManualMiningCalculateRequest): Promise<any> {
try {
const response = await fetch(
`${this.miningServiceUrl}/admin/manual-mining/calculate`,
{
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(request),
},
);
const result = await response.json();
if (!response.ok) {
throw new HttpException(
result.message || '计算失败',
response.status,
);
}
return result;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error('Failed to calculate manual mining', error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
* Execute manual mining and record an audit log
*/
async execute(
request: ManualMiningExecuteRequest,
adminId: string,
): Promise<any> {
try {
const response = await fetch(
`${this.miningServiceUrl}/admin/manual-mining/execute`,
{
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(request),
},
);
const result = await response.json();
if (!response.ok) {
throw new HttpException(
result.message || '执行失败',
response.status,
);
}
// Record an audit log entry
await this.prisma.auditLog.create({
data: {
adminId,
action: 'CREATE',
resource: 'MANUAL_MINING',
resourceId: result.recordId,
newValue: {
accountSequence: request.accountSequence,
adoptionDate: request.adoptionDate,
amount: result.amount,
reason: request.reason,
},
},
});
this.logger.log(
`Manual mining executed by admin ${adminId}: account=${request.accountSequence}, amount=${result.amount}`,
);
return result;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error('Failed to execute manual mining', error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
* Get manual mining records (paginated)
*/
async getRecords(page: number = 1, pageSize: number = 20): Promise<any> {
try {
const response = await fetch(
`${this.miningServiceUrl}/admin/manual-mining/records?page=${page}&pageSize=${pageSize}`,
{
method: 'GET',
headers: { 'Content-Type': 'application/json' },
},
);
const result = await response.json();
if (!response.ok) {
throw new HttpException(
result.message || '获取记录失败',
response.status,
);
}
return result;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error('Failed to get manual mining records', error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
/**
* Get a manual mining record by accountSequence
*/
async getRecordByAccountSequence(accountSequence: string): Promise<any> {
try {
const response = await fetch(
`${this.miningServiceUrl}/admin/manual-mining/records/${accountSequence}`,
{
method: 'GET',
headers: { 'Content-Type': 'application/json' },
},
);
if (response.status === 404) {
return null;
}
const result = await response.json();
if (!response.ok) {
throw new HttpException(
result.message || '获取记录失败',
response.status,
);
}
return result;
} catch (error) {
if (error instanceof HttpException) {
throw error;
}
this.logger.error('Failed to get manual mining record', error);
throw new HttpException(
`调用 mining-service 失败: ${error instanceof Error ? error.message : error}`,
HttpStatus.INTERNAL_SERVER_ERROR,
);
}
}
}
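Every proxy method in `ManualMiningService` repeats the same error-handling shape: re-throw known HTTP errors untouched, wrap anything else as a 500. A sketch of that shape with a stand-in error class (`KnownHttpError` and `proxyCall` are hypothetical minimal substitutes for Nest's `HttpException` and the repeated try/catch, not code from the service):

```typescript
// Stand-in for Nest's HttpException: a message plus an HTTP status code.
class KnownHttpError extends Error {
  constructor(message: string, readonly status: number) {
    super(message);
  }
}

// Re-throw known HTTP errors untouched; wrap everything else as a 500.
async function proxyCall<T>(call: () => Promise<T>): Promise<T> {
  try {
    return await call();
  } catch (error) {
    if (error instanceof KnownHttpError) throw error;
    const msg = error instanceof Error ? error.message : String(error);
    throw new KnownHttpError(`mining-service call failed: ${msg}`, 500);
  }
}
```

Centralizing the wrapper like this would let each method body shrink to the fetch-and-unwrap logic while keeping the upstream status codes (like a 404 from mining-service) intact.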

View File

@ -0,0 +1,138 @@
import { Injectable, Logger } from '@nestjs/common';
import { HttpService } from '@nestjs/axios';
import { ConfigService } from '@nestjs/config';
import { firstValueFrom } from 'rxjs';
@Injectable()
export class PendingContributionsService {
private readonly logger = new Logger(PendingContributionsService.name);
constructor(
private readonly httpService: HttpService,
private readonly configService: ConfigService,
) {}
private getMiningServiceUrl(): string {
return this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
}
/**
* Get pending contributions (paginated, optionally filtered by type)
*/
async getPendingContributions(
page: number = 1,
pageSize: number = 20,
contributionType?: string,
) {
const miningServiceUrl = this.getMiningServiceUrl();
try {
const params: any = { page, pageSize };
if (contributionType) {
params.contributionType = contributionType;
}
const response = await firstValueFrom(
this.httpService.get(`${miningServiceUrl}/admin/pending-contributions`, {
params,
}),
);
return response.data;
} catch (error) {
this.logger.warn(
`Failed to fetch pending contributions: ${error.message}`,
);
return { contributions: [], total: 0, page, pageSize };
}
}
  /**
   * Get a summary of pending contributions
   */
async getPendingContributionsSummary() {
const miningServiceUrl = this.getMiningServiceUrl();
try {
const response = await firstValueFrom(
this.httpService.get(
`${miningServiceUrl}/admin/pending-contributions/summary`,
),
);
return response.data;
} catch (error) {
this.logger.warn(
`Failed to fetch pending contributions summary: ${error.message}`,
);
return {
byType: [],
total: { totalAmount: '0', count: 0 },
totalMinedToHeadquarters: '0',
};
}
}
  /**
   * Get mining records for a single pending contribution
   */
async getPendingContributionMiningRecords(
id: string,
page: number = 1,
pageSize: number = 20,
) {
const miningServiceUrl = this.getMiningServiceUrl();
try {
const response = await firstValueFrom(
this.httpService.get(
`${miningServiceUrl}/admin/pending-contributions/${id}/records`,
{
params: { page, pageSize },
},
),
);
return response.data;
} catch (error) {
this.logger.warn(
`Failed to fetch pending contribution mining records: ${error.message}`,
);
return {
pendingContribution: null,
records: [],
total: 0,
page,
pageSize,
};
}
}
  /**
   * Get all pending-contribution mining records
   */
async getAllPendingMiningRecords(page: number = 1, pageSize: number = 20) {
const miningServiceUrl = this.getMiningServiceUrl();
try {
const response = await firstValueFrom(
this.httpService.get(
`${miningServiceUrl}/admin/pending-contributions/mining-records`,
{
params: { page, pageSize },
},
),
);
return response.data;
} catch (error) {
this.logger.warn(
`Failed to fetch all pending mining records: ${error.message}`,
);
return { records: [], total: 0, page, pageSize };
}
}
}
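Every method in PendingContributionsService above follows the same degrade-on-failure pattern: if mining-service is unreachable, log a warning and return an empty page rather than propagate the error. That pattern can be factored into a generic helper; `withFallback` is a hypothetical name, not part of this diff:

```typescript
// Hypothetical helper illustrating the degrade-to-empty pattern:
// run the upstream call, and on any failure return the supplied fallback.
async function withFallback<T>(fn: () => Promise<T>, fallback: T): Promise<T> {
  try {
    return await fn();
  } catch {
    // In the real services this branch also logs a warning.
    return fallback;
  }
}
```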


@ -1,84 +1,246 @@
import { Injectable, Logger } from '@nestjs/common';
import { HttpService } from '@nestjs/axios';
import { ConfigService } from '@nestjs/config';
import { firstValueFrom } from 'rxjs';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
interface MiningServiceSystemAccount {
id: string;
accountType: string; // OPERATION / PROVINCE / CITY / HEADQUARTERS
regionCode: string | null; // 省/市代码,如 440000, 440100
name: string;
totalMined: string;
availableBalance: string;
totalContribution: string;
lastSyncedAt: string | null;
}
interface MiningServiceResponse {
accounts: MiningServiceSystemAccount[];
total: number;
}
@Injectable()
export class SystemAccountsService {
  private readonly logger = new Logger(SystemAccountsService.name);
constructor(
private readonly prisma: PrismaService,
private readonly httpService: HttpService,
private readonly configService: ConfigService,
) {}
  /**
   * Fetch system-account mining data from mining-service
   */
private async fetchMiningServiceSystemAccounts(): Promise<Map<string, MiningServiceSystemAccount>> {
const miningServiceUrl = this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
try {
const response = await firstValueFrom(
this.httpService.get<MiningServiceResponse>(
`${miningServiceUrl}/admin/system-accounts`,
),
);
const miningDataMap = new Map<string, MiningServiceSystemAccount>();
for (const account of response.data.accounts) {
// 使用 accountType:regionCode 作为 key(与 contribution 表一致)
const key = account.regionCode
? `${account.accountType}:${account.regionCode}`
: account.accountType;
miningDataMap.set(key, account);
}
return miningDataMap;
} catch (error) {
this.logger.warn(
`Failed to fetch mining service system accounts: ${error.message}`,
);
return new Map();
}
}
  /**
   * Get the list of system accounts.
   * Primary source is the CDC-synced synced_system_contributions table,
   * merged with synced wallet data and live mining-service data.
   */
  async getSystemAccounts() {
    // 从 CDC 同步的 SyncedSystemContribution 获取算力数据(主要数据源)
    const syncedContributions = await this.prisma.syncedSystemContribution.findMany({
      orderBy: [{ accountType: 'asc' }, { regionCode: 'asc' }],
    });

    // 从 CDC 同步的 SyncedWalletSystemAccount 表获取钱包数据
    const syncedWalletAccounts = await this.prisma.syncedWalletSystemAccount.findMany();

    // 从 mining-service 获取挖矿数据
    const miningDataMap = await this.fetchMiningServiceSystemAccounts();

    // 获取省市名称映射
    const regionNameMap = await this.buildRegionNameMap();

    // 构建钱包数据映射
    const walletMap = new Map<string, any>();
    for (const wallet of syncedWalletAccounts) {
      // 钱包账户的 code 格式为 "CITY-440100"、"PROVINCE-440000" 等
      if (wallet.code) {
        const regionCode = this.extractRegionCodeFromCode(wallet.code);
        if (regionCode) {
          const key = `${wallet.accountType}:${regionCode}`;
          walletMap.set(key, wallet);
        }
      }
      // 同时用 accountType 作为 key(用于 OPERATION、HEADQUARTERS 等)
      walletMap.set(wallet.accountType, wallet);
    }

    // 构建返回数据 - 以算力账户为主
    const accounts = syncedContributions.map((contrib) => {
      const key = contrib.regionCode
        ? `${contrib.accountType}:${contrib.regionCode}`
        : contrib.accountType;

      const wallet = walletMap.get(key) || walletMap.get(contrib.accountType);
      const miningData = miningDataMap.get(key) || miningDataMap.get(contrib.accountType);

      // 获取显示名称
      const displayName = this.getDisplayName(contrib.accountType, contrib.regionCode, regionNameMap);

      return {
        id: contrib.id,
        accountType: contrib.accountType,
        regionCode: contrib.regionCode,
        name: displayName,
        code: wallet?.code || null,
        provinceId: wallet?.provinceId || null,
        cityId: wallet?.cityId || null,
        // 钱包余额(如果有钱包账户)
        shareBalance: wallet?.shareBalance?.toString() || '0',
        usdtBalance: wallet?.usdtBalance?.toString() || '0',
        greenPointBalance: wallet?.greenPointBalance?.toString() || '0',
        frozenShare: wallet?.frozenShare?.toString() || '0',
        frozenUsdt: wallet?.frozenUsdt?.toString() || '0',
        totalInflow: wallet?.totalInflow?.toString() || '0',
        totalOutflow: wallet?.totalOutflow?.toString() || '0',
        blockchainAddress: wallet?.blockchainAddress || null,
        isActive: wallet?.isActive ?? true,
        // 算力数据
        contributionBalance: contrib.contributionBalance?.toString() || '0',
        contributionNeverExpires: contrib.contributionNeverExpires || false,
        // 挖矿数据
        totalMined: miningData?.totalMined || '0',
        availableBalance: miningData?.availableBalance || '0',
        miningContribution: miningData?.totalContribution || '0',
        miningLastSyncedAt: miningData?.lastSyncedAt || null,
        syncedAt: contrib.syncedAt,
        source: 'contribution',
      };
    });

    return {
      accounts,
      total: accounts.length,
    };
  }
  /**
   * Build a region-code → name map from synced provinces and cities
   */
private async buildRegionNameMap(): Promise<Map<string, string>> {
const [provinces, cities] = await Promise.all([
this.prisma.syncedProvince.findMany({ select: { code: true, name: true } }),
this.prisma.syncedCity.findMany({ select: { code: true, name: true } }),
]);
const map = new Map<string, string>();
for (const province of provinces) {
map.set(province.code, province.name);
}
for (const city of cities) {
map.set(city.code, city.name);
}
return map;
}
  /**
   * Resolve the display name for an account type and region code
   */
private getDisplayName(
accountType: string,
regionCode: string | null,
regionNameMap: Map<string, string>,
): string {
// 基础账户类型名称
const baseNames: Record<string, string> = {
OPERATION: '运营账户',
HEADQUARTERS: '总部账户',
PROVINCE: '省公司账户',
CITY: '市公司账户',
};
if (!regionCode) {
return baseNames[accountType] || accountType;
}
// 根据区域代码查找名称
const regionName = regionNameMap.get(regionCode);
if (regionName) {
if (accountType === 'PROVINCE') {
return `${regionName}省公司`;
} else if (accountType === 'CITY') {
return `${regionName}市公司`;
      }
    }

    // 回退:使用区域代码
    return `${regionCode}账户`;
  }
  /**
   * Extract the region code from a wallet account code.
   * e.g. "CITY-440100" -> "440100", "PROVINCE-440000" -> "440000"
   * Returns null for codes without a region suffix (e.g. "HEADQUARTERS").
   */
private extractRegionCodeFromCode(code: string): string | null {
if (!code) return null;
// 匹配 CITY-XXXXXX, PROVINCE-XXXXXX, PROV-XXXXXX 格式
const match = code.match(/^(?:CITY|PROVINCE|PROV)-(\d+)$/);
return match ? match[1] : null;
  }
  /**
   * Get a summary across system accounts
   */
  async getSystemAccountsSummary() {
    const [
      syncedSystemAccounts,
      syncedPoolAccounts,
      syncedContributions,
      miningConfig,
      circulationPool,
    ] = await Promise.all([
      this.prisma.syncedWalletSystemAccount.findMany(),
      this.prisma.syncedWalletPoolAccount.findMany(),
      this.prisma.syncedSystemContribution.findMany(),
      this.prisma.syncedMiningConfig.findFirst(),
      this.prisma.syncedCirculationPool.findFirst(),
    ]);

    // 从 mining-service 获取挖矿数据汇总
    const miningDataMap = await this.fetchMiningServiceSystemAccounts();

    // 计算总挖矿积分股
    let totalMined = 0;
    for (const miningData of miningDataMap.values()) {
      totalMined += Number(miningData.totalMined || 0);
    }

    // 计算总算力
    let totalSyncedContribution = 0n;
    for (const contrib of syncedContributions) {
      totalSyncedContribution += BigInt(
@ -88,10 +250,22 @@ export class SystemAccountsService {
    return {
      systemAccounts: {
        count: syncedSystemAccounts.length,
        totalBalance: syncedSystemAccounts.reduce(
          (sum, acc) => sum + Number(acc.shareBalance),
          0,
        ).toFixed(8),
        totalMined: totalMined.toFixed(8),
      },
      poolAccounts: {
        count: syncedPoolAccounts.length,
        pools: syncedPoolAccounts.map((pool) => ({
          poolType: pool.poolType,
          name: pool.name,
          balance: pool.balance.toString(),
          targetBurn: pool.targetBurn?.toString(),
          remainingBurn: pool.remainingBurn?.toString(),
        })),
      },
      syncedContributions: {
        count: syncedContributions.length,
@ -115,4 +289,243 @@ export class SystemAccountsService {
        : null,
    };
  }
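The summary code above accumulates 8-decimal contribution values as `BigInt` and formats totals with `toFixed(8)`, i.e. fixed-point arithmetic scaled by 10^8 to avoid floating-point drift. A standalone sketch of that scaling, assuming non-negative decimal strings; the helper names are illustrative, not from the codebase:

```typescript
// Illustrative 8-decimal fixed-point accumulation:
// decimal strings are scaled to integer units, summed as BigInt,
// then formatted back to an 8-decimal string.
const SCALE = BigInt(100000000); // 10^8

function toFixedPoint(value: string): bigint {
  const [whole, frac = ''] = value.split('.');
  // Pad/truncate the fractional part to exactly 8 digits.
  return BigInt(whole) * SCALE + BigInt(frac.padEnd(8, '0').slice(0, 8));
}

function formatFixedPoint(total: bigint): string {
  const whole = total / SCALE;
  const frac = (total % SCALE).toString().padStart(8, '0');
  return `${whole}.${frac}`;
}
```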
  /**
   * Get mining records for a system account (proxied to mining-service)
   * @param accountType OPERATION/PROVINCE/CITY/HEADQUARTERS
   * @param regionCode Province/city code, e.g. 440000, 440100
   * @param page Page number
   * @param pageSize Page size
   */
async getSystemAccountMiningRecords(
accountType: string,
regionCode: string | null,
page: number = 1,
pageSize: number = 20,
) {
const miningServiceUrl = this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
try {
const params: Record<string, any> = { page, pageSize };
if (regionCode) {
params.regionCode = regionCode;
}
const response = await firstValueFrom(
this.httpService.get(
`${miningServiceUrl}/admin/system-accounts/${accountType}/records`,
{ params },
),
);
return response.data;
} catch (error) {
this.logger.warn(
`Failed to fetch system account mining records: ${error.message}`,
);
return { records: [], total: 0, page, pageSize, accountType, regionCode };
}
}
  /**
   * Get transactions for a system account (proxied to mining-service)
   * @param accountType OPERATION/PROVINCE/CITY/HEADQUARTERS
   * @param regionCode Province/city code, e.g. 440000, 440100
   * @param page Page number
   * @param pageSize Page size
   */
async getSystemAccountTransactions(
accountType: string,
regionCode: string | null,
page: number = 1,
pageSize: number = 20,
) {
const miningServiceUrl = this.configService.get<string>(
'MINING_SERVICE_URL',
'http://localhost:3021',
);
try {
const params: Record<string, any> = { page, pageSize };
if (regionCode) {
params.regionCode = regionCode;
}
const response = await firstValueFrom(
this.httpService.get(
`${miningServiceUrl}/admin/system-accounts/${accountType}/transactions`,
{ params },
),
);
return response.data;
} catch (error) {
this.logger.warn(
`Failed to fetch system account transactions: ${error.message}`,
);
return { transactions: [], total: 0, page, pageSize, accountType, regionCode };
}
}
  /**
   * Get contribution detail records for a system account,
   * joined with the source adoption orders and users.
   * @param accountType OPERATION/PROVINCE/CITY/HEADQUARTERS
   * @param regionCode Province/city code, e.g. 440000, 440100
   * @param page Page number
   * @param pageSize Page size
   */
async getSystemAccountContributionRecords(
accountType: string,
regionCode: string | null,
page: number = 1,
pageSize: number = 20,
) {
// Prisma 查询 null 值需要用 { equals: null }
const whereClause = regionCode
? { accountType, regionCode }
: { accountType, regionCode: { equals: null } };
const [records, total] = await Promise.all([
this.prisma.syncedSystemContributionRecord.findMany({
where: whereClause,
skip: (page - 1) * pageSize,
take: pageSize,
orderBy: { createdAt: 'desc' },
}),
this.prisma.syncedSystemContributionRecord.count({
where: whereClause,
}),
]);
// 获取关联的认种订单和用户信息
const adoptionIds = [...new Set(records.map(r => r.sourceAdoptionId))];
const accountSequences = [...new Set(records.map(r => r.sourceAccountSequence))];
const [adoptions, users] = await Promise.all([
this.prisma.syncedAdoption.findMany({
where: { originalAdoptionId: { in: adoptionIds } },
select: {
originalAdoptionId: true,
accountSequence: true,
treeCount: true,
adoptionDate: true,
status: true,
contributionPerTree: true,
},
}),
this.prisma.syncedUser.findMany({
where: { accountSequence: { in: accountSequences } },
select: {
accountSequence: true,
phone: true,
realName: true,
nickname: true,
},
}),
]);
// 构建映射
const adoptionMap = new Map(adoptions.map(a => [a.originalAdoptionId.toString(), a]));
const userMap = new Map(users.map(u => [u.accountSequence, u]));
return {
records: records.map((record) => {
const adoption = adoptionMap.get(record.sourceAdoptionId.toString());
const user = userMap.get(record.sourceAccountSequence);
return {
originalRecordId: record.originalRecordId.toString(),
accountType: record.accountType,
regionCode: record.regionCode,
sourceAdoptionId: record.sourceAdoptionId.toString(),
sourceAccountSequence: record.sourceAccountSequence,
// 来源类型
sourceType: record.sourceType,
levelDepth: record.levelDepth,
// 认种订单详情
adoptionTreeCount: adoption?.treeCount || 0,
adoptionDate: adoption?.adoptionDate || null,
adoptionStatus: adoption?.status || null,
contributionPerTree: adoption?.contributionPerTree?.toString() || '0',
// 用户信息
sourceUserPhone: user?.phone ? this.maskPhone(user.phone) : null,
sourceUserName: user?.realName || user?.nickname || null,
// 分配信息
distributionRate: record.distributionRate.toString(),
amount: record.amount.toString(),
effectiveDate: record.effectiveDate,
expireDate: record.expireDate,
isExpired: record.isExpired,
createdAt: record.createdAt,
syncedAt: record.syncedAt,
};
}),
total,
page,
pageSize,
totalPages: Math.ceil(total / pageSize),
};
}
  /**
   * Mask the middle digits of a phone number
   */
private maskPhone(phone: string): string {
if (!phone || phone.length < 7) return phone;
return phone.substring(0, 3) + '****' + phone.substring(phone.length - 4);
}
  /**
   * Get contribution statistics for a system account
   * (record count, amount sum, distinct source adoptions and users)
   */
async getSystemAccountContributionStats(accountType: string, regionCode: string | null) {
// 获取算力账户信息
// 使用 findFirst 替代 findUnique因为 regionCode 可以为 null
const contribution = await this.prisma.syncedSystemContribution.findFirst({
where: {
accountType,
regionCode: regionCode === null ? { equals: null } : regionCode,
},
});
const whereClause = regionCode
? { accountType, regionCode }
: { accountType, regionCode: { equals: null } };
// 获取明细记录统计
const recordStats = await this.prisma.syncedSystemContributionRecord.aggregate({
where: whereClause,
_count: true,
_sum: { amount: true },
});
// 获取来源认种订单数量(去重)
const uniqueAdoptions = await this.prisma.syncedSystemContributionRecord.groupBy({
by: ['sourceAdoptionId'],
where: whereClause,
});
// 获取来源用户数量(去重)
const uniqueUsers = await this.prisma.syncedSystemContributionRecord.groupBy({
by: ['sourceAccountSequence'],
where: whereClause,
});
return {
accountType,
regionCode,
name: contribution?.name || accountType,
totalContribution: contribution?.contributionBalance?.toString() || '0',
recordCount: recordStats._count,
sumFromRecords: recordStats._sum?.amount?.toString() || '0',
uniqueAdoptionCount: uniqueAdoptions.length,
uniqueUserCount: uniqueUsers.length,
};
}
}
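`maskPhone` and `extractRegionCodeFromCode` above are pure functions, so their behavior is easy to verify in isolation. Standalone copies with the same logic (renamed here only to stand alone):

```typescript
// Standalone copy of extractRegionCodeFromCode:
// matches CITY-XXXXXX, PROVINCE-XXXXXX, PROV-XXXXXX and returns the digits.
function extractRegionCode(code: string): string | null {
  if (!code) return null;
  const match = code.match(/^(?:CITY|PROVINCE|PROV)-(\d+)$/);
  return match ? match[1] : null;
}

// Standalone copy of maskPhone: keeps the first 3 and last 4 digits.
function maskPhone(phone: string): string {
  if (!phone || phone.length < 7) return phone;
  return phone.substring(0, 3) + '****' + phone.substring(phone.length - 4);
}
```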


@ -1,4 +1,5 @@
import { Injectable, NotFoundException, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PrismaService } from '../../infrastructure/persistence/prisma/prisma.service';
import { Prisma } from '@prisma/client';
@ -20,7 +21,15 @@ export interface GetOrdersQuery {
@Injectable()
export class UsersService {
  private readonly logger = new Logger(UsersService.name);
private readonly miningServiceUrl: string;
constructor(
private readonly prisma: PrismaService,
private readonly configService: ConfigService,
) {
this.miningServiceUrl = this.configService.get<string>('MINING_SERVICE_URL', 'http://localhost:3021');
}
  /**
   *
@ -103,32 +112,38 @@ export class UsersService {
   */
  private async getAdoptionStatsForUsers(
    accountSequences: string[],
  ): Promise<Map<string, { personalCount: number; personalOrders: number; teamCount: number; teamOrders: number }>> {
    const result = new Map<
      string,
      { personalCount: number; personalOrders: number; teamCount: number; teamOrders: number }
    >();

    if (accountSequences.length === 0) return result;

    // 获取每个用户的个人认种数量和订单数(只统计 MINING_ENABLED 状态)
    const personalAdoptions = await this.prisma.syncedAdoption.groupBy({
      by: ['accountSequence'],
      where: {
        accountSequence: { in: accountSequences },
        status: 'MINING_ENABLED',
      },
      _sum: { treeCount: true },
      _count: { id: true },
    });

    for (const stat of personalAdoptions) {
      result.set(stat.accountSequence, {
        personalCount: stat._sum.treeCount || 0,
        personalOrders: stat._count.id || 0,
        teamCount: 0,
        teamOrders: 0,
      });
    }

    // 确保所有用户都有记录
    for (const seq of accountSequences) {
      if (!result.has(seq)) {
        result.set(seq, { personalCount: 0, personalOrders: 0, teamCount: 0, teamOrders: 0 });
      }
    }
@ -153,12 +168,15 @@ export class UsersService {
      const teamAdoptionStats = await this.prisma.syncedAdoption.aggregate({
        where: {
          accountSequence: { in: teamMembers.map((m) => m.accountSequence) },
          status: 'MINING_ENABLED',
        },
        _sum: { treeCount: true },
        _count: { id: true },
      });

      const stats = result.get(ref.accountSequence);
      if (stats) {
        stats.teamCount = teamAdoptionStats._sum.treeCount || 0;
        stats.teamOrders = teamAdoptionStats._count.id || 0;
      }
    }
  }
@ -212,9 +230,9 @@ export class UsersService {
      throw new NotFoundException(`用户 ${accountSequence} 不存在`);
    }

    // 获取个人认种数量(从 synced_adoptions 统计,只统计 MINING_ENABLED 状态)
    const personalAdoptionStats = await this.prisma.syncedAdoption.aggregate({
      where: { accountSequence, status: 'MINING_ENABLED' },
      _sum: { treeCount: true },
      _count: { id: true },
    });
@ -226,7 +244,7 @@ export class UsersService {
    });
    const directReferralCount = directReferrals.length;

    // 获取直推认种数量(只统计 MINING_ENABLED 状态)
    let directReferralAdoptions = 0;
    if (directReferrals.length > 0) {
      const directAdoptionStats = await this.prisma.syncedAdoption.aggregate({
@ -234,6 +252,7 @@ export class UsersService {
          accountSequence: {
            in: directReferrals.map((r) => r.accountSequence),
          },
          status: 'MINING_ENABLED',
        },
        _sum: { treeCount: true },
      });
@ -267,6 +286,7 @@ export class UsersService {
          accountSequence: {
            in: teamMembers.map((m) => m.accountSequence),
          },
          status: 'MINING_ENABLED',
        },
        _sum: { treeCount: true },
      });
@ -412,8 +432,7 @@ export class UsersService {
  }

  /**
   * Get user mining records (proxied to mining-service)
   */
  async getUserMiningRecords(
    accountSequence: string,
@ -430,33 +449,79 @@ export class UsersService {
    }

    const mining = user.miningAccount;
    const emptySummary = {
      accountSequence,
      totalMined: '0',
      availableBalance: '0',
      frozenBalance: '0',
      totalContribution: '0',
    };

    // 从 mining-service 获取挖矿记录
    try {
      const url = `${this.miningServiceUrl}/api/v2/mining/accounts/${accountSequence}/records?page=${page}&pageSize=${pageSize}`;
      this.logger.log(`Fetching mining records from ${url}`);
      const response = await fetch(url);

      if (!response.ok) {
        this.logger.warn(`Failed to fetch mining records: ${response.status}`);
        return {
          summary: mining ? {
            accountSequence,
            totalMined: mining.totalMined.toString(),
            availableBalance: mining.availableBalance.toString(),
            frozenBalance: mining.frozenBalance.toString(),
            totalContribution: mining.totalContribution.toString(),
          } : emptySummary,
          records: [],
          pagination: { page, pageSize, total: 0, totalPages: 0 },
        };
      }

      const result = await response.json();
      const recordsData = result.data || result;

      // 格式化记录以匹配前端期望的格式
      const records = (recordsData.data || []).map((r: any) => ({
        id: r.id,
        accountSequence,
        distributionMinute: r.miningMinute,
        contributionRatio: r.contributionRatio,
        shareAmount: r.minedAmount,
        priceSnapshot: r.secondDistribution,
        createdAt: r.createdAt,
      }));

      return {
        summary: mining ? {
          accountSequence,
          totalMined: mining.totalMined.toString(),
          availableBalance: mining.availableBalance.toString(),
          frozenBalance: mining.frozenBalance.toString(),
          totalContribution: mining.totalContribution.toString(),
        } : emptySummary,
        records,
        pagination: {
          page,
          pageSize,
          total: recordsData.total || 0,
          totalPages: Math.ceil((recordsData.total || 0) / pageSize),
        },
      };
    } catch (error) {
      this.logger.error('Failed to fetch mining records from mining-service', error);
      return {
        summary: mining ? {
          accountSequence,
          totalMined: mining.totalMined.toString(),
          availableBalance: mining.availableBalance.toString(),
          frozenBalance: mining.frozenBalance.toString(),
          totalContribution: mining.totalContribution.toString(),
        } : emptySummary,
        records: [],
        pagination: { page, pageSize, total: 0, totalPages: 0 },
      };
    }
  }
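The field renames applied to mining-service records above (`miningMinute` → `distributionMinute`, `minedAmount` → `shareAmount`, `secondDistribution` → `priceSnapshot`) can be isolated as a pure mapper, which makes the translation easy to verify; `mapMiningRecord` is an illustrative extraction, not a function in the codebase:

```typescript
// Pure version of the record mapping in getUserMiningRecords,
// applying the same field renames between mining-service and admin-web.
function mapMiningRecord(accountSequence: string, r: any) {
  return {
    id: r.id,
    accountSequence,
    distributionMinute: r.miningMinute,
    contributionRatio: r.contributionRatio,
    shareAmount: r.minedAmount,
    priceSnapshot: r.secondDistribution,
    createdAt: r.createdAt,
  };
}
```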
  /**
@ -568,14 +633,14 @@ export class UsersService {
  }

  /**
   * Get a user's adoption stats (counts MINING_ENABLED adoptions only)
   */
  private async getUserAdoptionStats(
    accountSequence: string,
  ): Promise<{ personal: number; team: number }> {
    // 个人认种(只统计 MINING_ENABLED 状态)
    const personalStats = await this.prisma.syncedAdoption.aggregate({
      where: { accountSequence, status: 'MINING_ENABLED' },
      _sum: { treeCount: true },
    });
@ -587,7 +652,7 @@ export class UsersService {
    let teamCount = 0;
    if (referral?.originalUserId) {
      // 团队认种 = 所有下级的认种总和(只统计 MINING_ENABLED 状态)
      const teamMembers = await this.prisma.syncedReferral.findMany({
        where: {
          ancestorPath: { contains: referral.originalUserId.toString() },
@ -599,6 +664,7 @@ export class UsersService {
      const teamStats = await this.prisma.syncedAdoption.aggregate({
        where: {
          accountSequence: { in: teamMembers.map((m) => m.accountSequence) },
          status: 'MINING_ENABLED',
        },
        _sum: { treeCount: true },
      });
@ -840,7 +906,7 @@ export class UsersService {
  /**
   * Get a user's wallet ledger
   * Summary data comes from SyncedUserWallet and SyncedMiningAccount
   */
  async getWalletLedger(accountSequence: string, page: number, pageSize: number) {
    const user = await this.prisma.syncedUser.findUnique({
@ -852,20 +918,44 @@ export class UsersService {
      throw new NotFoundException(`用户 ${accountSequence} 不存在`);
    }

    // 获取用户的各类钱包数据
    const wallets = await this.prisma.syncedUserWallet.findMany({
      where: { accountSequence },
    });

    // 按钱包类型分类
    const walletByType = new Map(wallets.map(w => [w.walletType, w]));
    const greenPointsWallet = walletByType.get('GREEN_POINTS');
    const contributionWallet = walletByType.get('CONTRIBUTION');
    const tokenWallet = walletByType.get('TOKEN_STORAGE');

    const mining = user.miningAccount;

    // 构建前端期望的钱包汇总格式
    // usdtAvailable = GREEN_POINTS 钱包的可用余额(绿积分)
    // usdtFrozen = GREEN_POINTS 钱包的冻结余额
    // pendingUsdt = 待领取收益(挖矿余额)
    // settleableUsdt = 可结算收益
    // settledTotalUsdt = 已结算收益
    // expiredTotalUsdt = 过期收益
    const summary = {
      usdtAvailable: greenPointsWallet?.balance?.toString() || '0',
      usdtFrozen: greenPointsWallet?.frozenBalance?.toString() || '0',
      pendingUsdt: mining?.availableBalance?.toString() || '0', // 挖矿可用余额作为待领取
      settleableUsdt: '0', // 暂无数据源
      settledTotalUsdt: greenPointsWallet?.totalInflow?.toString() || '0', // 总流入作为已结算
      expiredTotalUsdt: '0', // 暂无数据源
    };

    // TODO: 实现钱包流水分页查询
    // 目前从 SyncedUserWallet 只能获取汇总数据,流水明细需要额外的表
    return {
      summary,
      items: [],
      total: 0,
      page,
      pageSize,
      totalPages: 0,
    };
  }
@ -876,7 +966,7 @@ export class UsersService {
  private formatUserListItem(
    user: any,
    extra?: {
      adoptionStats?: { personalCount: number; personalOrders: number; teamCount: number; teamOrders: number };
      referrerInfo?: { nickname: string | null; phone: string } | null;
    },
  ) {
@ -892,7 +982,9 @@ export class UsersService {
      // 认种统计
      adoption: {
        personalAdoptionCount: extra?.adoptionStats?.personalCount || 0,
        personalAdoptionOrders: extra?.adoptionStats?.personalOrders || 0,
        teamAdoptions: extra?.adoptionStats?.teamCount || 0,
        teamAdoptionOrders: extra?.adoptionStats?.teamOrders || 0,
      },
      // 推荐人信息
      referral: user.referral


@ -1,12 +1,20 @@
import { Module, Global } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { HttpModule } from '@nestjs/axios';
import { PrismaModule } from './persistence/prisma/prisma.module';
import { RedisService } from './redis/redis.service';
import { KafkaModule } from './kafka/kafka.module';

@Global()
@Module({
  imports: [
    PrismaModule,
    KafkaModule,
    HttpModule.register({
      timeout: 10000,
      maxRedirects: 5,
    }),
  ],
  providers: [
    {
      provide: 'REDIS_OPTIONS',
@ -20,6 +28,6 @@ import { KafkaModule } from './kafka/kafka.module';
    },
    RedisService,
  ],
  exports: [PrismaModule, RedisService, KafkaModule, HttpModule],
})
export class InfrastructureModule {}


@ -317,9 +317,17 @@ export class CdcConsumerService implements OnModuleInit, OnModuleDestroy {
   * Normalize events from the Debezium outbox and from services publishing directly to the event bus
   */
  private normalizeServiceEvent(data: any): Omit<ServiceEvent, 'sequenceNum' | 'sourceTopic'> {
    // 如果已经是驼峰格式(mining-wallet-service 直接发布的事件)
    // 注意:mining-wallet-service 使用 eventId 而不是 id
    if (data.eventType && data.aggregateType) {
      return {
        id: data.id ?? data.eventId,
        eventType: data.eventType,
        aggregateType: data.aggregateType,
        aggregateId: data.aggregateId,
        payload: data.payload,
        createdAt: data.createdAt,
      };
    }

    // Debezium outbox 格式转换
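The id fallback above exists because mining-wallet-service publishes events with an `eventId` field while Debezium-normalized events carry `id`; both must normalize to `id`. The fallback reduces to one nullish-coalescing expression; `resolveEventId` is an illustrative extraction, not a function in the codebase:

```typescript
// Sketch of the id fallback in normalizeServiceEvent:
// prefer `id`, fall back to `eventId` when only that is present.
function resolveEventId(data: { id?: string; eventId?: string }): string | undefined {
  return data.id ?? data.eventId;
}
```

Using `??` rather than `||` matters only if an empty-string id should be preserved; here both fields are opaque identifiers, so either operator would behave the same in practice.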


@@ -158,6 +158,16 @@ export class CdcSyncService implements OnModuleInit {
       'SystemContributionUpdated',
       this.withIdempotency(this.handleSystemContributionUpdated.bind(this)),
     );

+    // SystemAccountSynced - sync system account contribution (from contribution-service)
+    this.cdcConsumer.registerServiceHandler(
+      'SystemAccountSynced',
+      this.withIdempotency(this.handleSystemAccountSynced.bind(this)),
+    );
+
+    // SystemContributionRecordCreated - sync system account contribution detail records (from contribution-service)
+    this.cdcConsumer.registerServiceHandler(
+      'SystemContributionRecordCreated',
+      this.withIdempotency(this.handleSystemContributionRecordCreated.bind(this)),
+    );
+
     // ReferralSynced - sync referral relationships
     this.cdcConsumer.registerServiceHandler(
       'ReferralSynced',
@@ -353,6 +363,12 @@ export class CdcSyncService implements OnModuleInit {
       this.withIdempotency(this.walletHandlers.handleFeeConfigUpdated.bind(this.walletHandlers)),
     );

+    // CONTRIBUTION_CREDITED - update the user wallet when contribution is credited
+    this.cdcConsumer.registerServiceHandler(
+      'CONTRIBUTION_CREDITED',
+      this.withIdempotency(this.handleContributionCredited.bind(this)),
+    );
+
     this.logger.log('CDC sync handlers registered with idempotency protection');
   }
@@ -524,20 +540,165 @@ export class CdcSyncService implements OnModuleInit {
   private async handleSystemContributionUpdated(event: ServiceEvent, tx: TransactionClient): Promise<void> {
     const { payload } = event;
-    await tx.syncedSystemContribution.upsert({
-      where: { accountType: payload.accountType },
-      create: {
-        accountType: payload.accountType,
-        name: payload.name,
-        contributionBalance: payload.contributionBalance || 0,
-        contributionNeverExpires: payload.contributionNeverExpires || false,
-      },
-      update: {
-        name: payload.name,
-        contributionBalance: payload.contributionBalance,
-        contributionNeverExpires: payload.contributionNeverExpires,
-      },
-    });
+    const accountType = payload.accountType;
+    const regionCode = payload.regionCode || null;
+
+    // Find every matching record (handles duplicates that may already exist)
+    // Note: in PostgreSQL NULL != NULL, so the unique constraint does not apply when regionCode is NULL
+    const existingRecords = await tx.syncedSystemContribution.findMany({
+      where: {
+        accountType,
+        regionCode: regionCode === null ? { equals: null } : regionCode,
+      },
+      orderBy: { syncedAt: 'asc' },
+    });
+
+    if (existingRecords.length > 0) {
+      await tx.syncedSystemContribution.update({
+        where: { id: existingRecords[0].id },
+        data: {
+          name: payload.name,
+          contributionBalance: payload.contributionBalance,
+          contributionNeverExpires: payload.contributionNeverExpires,
+        },
+      });
+
+      // Delete duplicate records
+      if (existingRecords.length > 1) {
+        const duplicateIds = existingRecords.slice(1).map(r => r.id);
+        await tx.syncedSystemContribution.deleteMany({
+          where: { id: { in: duplicateIds } },
+        });
+        this.logger.warn(
+          `Deleted ${duplicateIds.length} duplicate system contribution records for ${accountType}:${regionCode}`,
+        );
+      }
+    } else {
+      await tx.syncedSystemContribution.create({
+        data: {
+          accountType,
+          regionCode,
+          name: payload.name,
+          contributionBalance: payload.contributionBalance || 0,
+          contributionNeverExpires: payload.contributionNeverExpires || false,
+        },
+      });
+    }
+  }
+
+  /**
+   * SystemAccountSynced - sync system account contribution (from contribution-service)
+   * accountType: OPERATION / PROVINCE / CITY / HEADQUARTERS
+   * regionCode: province/city code, e.g. 440000, 440100
+   *
+   * In PostgreSQL NULL != NULL, so @@unique([accountType, regionCode])
+   * does not deduplicate rows whose regionCode is NULL
+   */
+  private async handleSystemAccountSynced(event: ServiceEvent, tx: TransactionClient): Promise<void> {
+    const { payload } = event;
+    const accountType = payload.accountType; // OPERATION / PROVINCE / CITY / HEADQUARTERS
+    const regionCode = payload.regionCode || null;
+
+    // Find every matching record (handles duplicates that may already exist)
+    const existingRecords = await tx.syncedSystemContribution.findMany({
+      where: {
+        accountType,
+        regionCode: regionCode === null ? { equals: null } : regionCode,
+      },
+      orderBy: { syncedAt: 'asc' }, // keep the earliest record
+    });
+
+    if (existingRecords.length > 0) {
+      // Update the first record
+      await tx.syncedSystemContribution.update({
+        where: { id: existingRecords[0].id },
+        data: {
+          name: payload.name,
+          contributionBalance: payload.contributionBalance,
+        },
+      });
+
+      // If duplicates exist, delete the extras (keep only the first)
+      if (existingRecords.length > 1) {
+        const duplicateIds = existingRecords.slice(1).map(r => r.id);
+        await tx.syncedSystemContribution.deleteMany({
+          where: { id: { in: duplicateIds } },
+        });
+        this.logger.warn(
+          `Deleted ${duplicateIds.length} duplicate system contribution records for ${accountType}:${regionCode}`,
+        );
+      }
+    } else {
+      await tx.syncedSystemContribution.create({
+        data: {
+          accountType,
+          regionCode,
+          name: payload.name,
+          contributionBalance: payload.contributionBalance || 0,
+          contributionNeverExpires: true, // system account contribution never expires
+        },
+      });
+    }
+  }
+
+  /**
+   * SystemContributionRecordCreated - sync system account contribution detail records
+   * (from contribution-service)
+   */
+  private async handleSystemContributionRecordCreated(event: ServiceEvent, tx: TransactionClient): Promise<void> {
+    const { payload } = event;
+
+    // contribution-service uses the systemAccountType field; keep a fallback for compatibility
+    const systemAccountType = payload.systemAccountType || payload.accountType;
+
+    // Parse systemAccountType (may be "PROVINCE_440000" or just "PROVINCE")
+    let accountType: string;
+    let regionCode: string | null = null;
+    if (systemAccountType?.includes('_')) {
+      const parts = systemAccountType.split('_');
+      accountType = parts[0];
+      regionCode = parts.slice(1).join('_');
+    } else {
+      accountType = systemAccountType;
+      regionCode = payload.regionCode || null;
+    }
+
+    await tx.syncedSystemContributionRecord.upsert({
+      where: { originalRecordId: BigInt(payload.recordId) },
+      create: {
+        originalRecordId: BigInt(payload.recordId),
+        accountType,
+        regionCode,
+        sourceAdoptionId: BigInt(payload.sourceAdoptionId),
+        sourceAccountSequence: payload.sourceAccountSequence,
+        sourceType: payload.sourceType || 'FIXED_RATE', // source type
+        levelDepth: payload.levelDepth ?? null, // referral level depth
+        distributionRate: payload.distributionRate,
+        amount: payload.amount,
+        effectiveDate: new Date(payload.effectiveDate),
+        expireDate: payload.expireDate ? new Date(payload.expireDate) : null,
+        isExpired: false,
+        createdAt: new Date(payload.createdAt),
+      },
+      update: {
+        accountType,
+        regionCode,
+        sourceAdoptionId: BigInt(payload.sourceAdoptionId),
+        sourceAccountSequence: payload.sourceAccountSequence,
+        sourceType: payload.sourceType || 'FIXED_RATE',
+        levelDepth: payload.levelDepth ?? null,
+        distributionRate: payload.distributionRate,
+        amount: payload.amount,
+        effectiveDate: new Date(payload.effectiveDate),
+        expireDate: payload.expireDate ? new Date(payload.expireDate) : null,
+      },
+    });
+
+    this.logger.debug(
+      `Synced system contribution record: recordId=${payload.recordId}, account=${accountType}:${regionCode}, amount=${payload.amount}`,
+    );
   }
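
The `systemAccountType` parsing introduced in this hunk is pure string logic and can be verified standalone:

```typescript
// Standalone sketch of the systemAccountType parsing above:
// "PROVINCE_440000" -> { accountType: "PROVINCE", regionCode: "440000" };
// a bare "HEADQUARTERS" carries no region code.
function parseSystemAccountType(systemAccountType: string): { accountType: string; regionCode: string | null } {
  if (systemAccountType.includes('_')) {
    const parts = systemAccountType.split('_');
    // Everything after the first underscore is treated as the region code
    return { accountType: parts[0], regionCode: parts.slice(1).join('_') };
  }
  return { accountType: systemAccountType, regionCode: null };
}
```

Joining `parts.slice(1)` back with `_` keeps any region code that itself contains underscores intact.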
@@ -696,6 +857,9 @@ export class CdcSyncService implements OnModuleInit {
     const { payload } = event;

     // Keep only one mining config record
     await tx.syncedMiningConfig.deleteMany({});
+
+    // mining-service publishes secondDistribution; derive minuteDistribution = secondDistribution * 60
+    const secondDistribution = parseFloat(payload.secondDistribution || '0');
+    const minuteDistribution = payload.minuteDistribution || (secondDistribution * 60).toString();
     await tx.syncedMiningConfig.create({
       data: {
         totalShares: payload.totalShares,
@@ -703,7 +867,7 @@ export class CdcSyncService implements OnModuleInit {
         remainingDistribution: payload.remainingDistribution,
         halvingPeriodYears: payload.halvingPeriodYears,
         currentEra: payload.currentEra || 1,
-        minuteDistribution: payload.minuteDistribution,
+        minuteDistribution: minuteDistribution,
         isActive: payload.isActive || false,
         activatedAt: payload.activatedAt ? new Date(payload.activatedAt) : null,
       },
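
The fallback added in this hunk is just a unit conversion; sketched standalone, with the field names as they appear in the payload above:

```typescript
// Derive the per-minute distribution when the event only carries the
// per-second value: minuteDistribution = secondDistribution * 60.
function resolveMinuteDistribution(payload: { secondDistribution?: string; minuteDistribution?: string }): string {
  const secondDistribution = parseFloat(payload.secondDistribution || '0');
  return payload.minuteDistribution || (secondDistribution * 60).toString();
}
```

An explicit `minuteDistribution` in the payload wins; otherwise a rate of `'0.5'` per second becomes `'30'` per minute.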
@@ -813,4 +977,60 @@ export class CdcSyncService implements OnModuleInit {
     this.logger.debug('Synced circulation pool');
   }
+
+  // ===========================================================================
+  // Wallet event handlers (mining-wallet-service)
+  // ===========================================================================
+
+  /**
+   * CONTRIBUTION_CREDITED - update the user wallet when contribution is credited
+   * (from mining-wallet-service)
+   * payload: { accountSequence, walletType, amount, balanceAfter, transactionId, ... }
+   */
+  private async handleContributionCredited(event: ServiceEvent, tx: TransactionClient): Promise<void> {
+    const { payload } = event;
+    const walletType = payload.walletType || 'CONTRIBUTION';
+
+    // Check whether the wallet already exists
+    const existing = await tx.syncedUserWallet.findUnique({
+      where: {
+        accountSequence_walletType: {
+          accountSequence: payload.accountSequence,
+          walletType,
+        },
+      },
+    });
+
+    if (existing) {
+      // Update the balance (use the latest balanceAfter)
+      await tx.syncedUserWallet.update({
+        where: { id: existing.id },
+        data: {
+          balance: payload.balanceAfter,
+          totalInflow: {
+            increment: parseFloat(payload.amount) || 0,
+          },
+        },
+      });
+    } else {
+      // Create a new wallet record
+      // originalId: derive a stable ID from accountSequence + walletType
+      const originalId = `wallet-${payload.accountSequence}-${walletType}`;
+      await tx.syncedUserWallet.create({
+        data: {
+          originalId,
+          accountSequence: payload.accountSequence,
+          walletType,
+          balance: payload.balanceAfter || 0,
+          frozenBalance: 0,
+          totalInflow: parseFloat(payload.amount) || 0,
+          totalOutflow: 0,
+          isActive: true,
+        },
+      });
+    }
+
+    this.logger.debug(`Synced user wallet from CONTRIBUTION_CREDITED: ${payload.accountSequence}, balance: ${payload.balanceAfter}`);
+  }
 }
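
The reason the handlers in this file go through `findMany` plus manual cleanup rather than a plain `upsert`: in PostgreSQL, NULL never compares equal to NULL, so a unique constraint on `(accountType, regionCode)` cannot stop duplicate rows whose `regionCode` is NULL. A standalone sketch of the keep-first/delete-rest pattern against an in-memory array (illustrative names, not the Prisma API):

```typescript
interface SyncedRow {
  id: number;
  accountType: string;
  regionCode: string | null;
  contributionBalance: string;
}

// Keep the earliest matching row (as with orderBy: { syncedAt: 'asc' }),
// update it in place, drop the duplicates, or create the row if none exists.
function upsertDedup(rows: SyncedRow[], accountType: string, regionCode: string | null, balance: string): SyncedRow[] {
  const matches = rows.filter(r => r.accountType === accountType && r.regionCode === regionCode);
  if (matches.length === 0) {
    const nextId = rows.reduce((m, r) => Math.max(m, r.id), 0) + 1;
    return [...rows, { id: nextId, accountType, regionCode, contributionBalance: balance }];
  }
  matches[0].contributionBalance = balance; // earliest row wins
  const duplicateIds = new Set(matches.slice(1).map(r => r.id));
  return rows.filter(r => !duplicateIds.has(r.id));
}
```

In-memory, `null === null` holds, so the filter finds the duplicates the database constraint could not prevent; the real handlers do the equivalent with `regionCode: { equals: null }`.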


@@ -21,8 +21,8 @@ KAFKA_GROUP_ID=mining-service-group
 JWT_SECRET=your-jwt-secret-key

 # Mining Configuration
-TOTAL_SHARES=100020000000
-DISTRIBUTION_POOL=200000000
+TOTAL_SHARES=10002000000
+DISTRIBUTION_POOL=2000000
 INITIAL_PRICE=1
 HALVING_PERIOD_YEARS=2
 BURN_TARGET=10000000000


@@ -14,7 +14,7 @@ RUN npm ci
 RUN DATABASE_URL="postgresql://user:pass@localhost:5432/db" npx prisma generate

 COPY src ./src
-RUN npm run build
+RUN npm run build && ls -la dist/ && test -f dist/main.js && echo "Build successful: dist/main.js exists"

 # Stage 2: production runtime
 FROM node:20-alpine AS runner
@@ -30,14 +30,16 @@ WORKDIR /app
 USER nestjs

 COPY --chown=nestjs:nodejs package*.json ./
-RUN npm ci --only=production && npm cache clean --force
+COPY --chown=nestjs:nodejs tsconfig*.json ./
+RUN npm ci --only=production && npm install ts-node typescript @types/node --save-dev && npm cache clean --force

 COPY --chown=nestjs:nodejs prisma ./prisma/
 RUN DATABASE_URL="postgresql://user:pass@localhost:5432/db" npx prisma generate

 COPY --chown=nestjs:nodejs --from=builder /app/dist ./dist
+RUN ls -la dist/ && test -f dist/main.js && echo "Copy successful: dist/main.js exists"

-RUN printf '#!/bin/sh\nset -e\necho "Running database migrations..."\nnpx prisma migrate deploy\necho "Starting application..."\nexec node dist/main.js\n' > /app/start.sh && chmod +x /app/start.sh
+RUN printf '#!/bin/sh\nset -e\necho "Running database migrations..."\nnpx prisma migrate deploy\necho "Running database seed..."\nnpx prisma db seed || echo "Seed skipped or already applied"\necho "Starting application..."\nexec node dist/main.js\n' > /app/start.sh && chmod +x /app/start.sh

 ENV NODE_ENV=production
 ENV TZ=Asia/Shanghai


@ -16,7 +16,8 @@
"prisma:generate": "prisma generate", "prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate dev", "prisma:migrate": "prisma migrate dev",
"prisma:migrate:prod": "prisma migrate deploy", "prisma:migrate:prod": "prisma migrate deploy",
"prisma:studio": "prisma studio" "prisma:studio": "prisma studio",
"prisma:seed": "ts-node prisma/seed.ts"
}, },
"dependencies": { "dependencies": {
"@nestjs/common": "^10.3.0", "@nestjs/common": "^10.3.0",
@ -37,6 +38,9 @@
"rxjs": "^7.8.1", "rxjs": "^7.8.1",
"swagger-ui-express": "^5.0.0" "swagger-ui-express": "^5.0.0"
}, },
"prisma": {
"seed": "ts-node prisma/seed.ts"
},
"devDependencies": { "devDependencies": {
"@nestjs/cli": "^10.2.1", "@nestjs/cli": "^10.2.1",
"@nestjs/schematics": "^10.0.3", "@nestjs/schematics": "^10.0.3",


@ -1,6 +1,7 @@
-- ============================================================================ -- ============================================================================
-- mining-service 初始化 migration -- mining-service 初始化 migration
-- 合并自: 20260111000000_init (只有一个,无需合并) -- 合并自: 0001_init, 0002_minute_to_second, 0003_add_system_accounts_and_pending_mining,
-- 20250120000001_add_region_to_system_mining_accounts
-- ============================================================================ -- ============================================================================
-- CreateEnum -- CreateEnum
@ -21,7 +22,11 @@ CREATE TABLE "mining_configs" (
"halvingPeriodYears" INTEGER NOT NULL DEFAULT 2, "halvingPeriodYears" INTEGER NOT NULL DEFAULT 2,
"currentEra" INTEGER NOT NULL DEFAULT 1, "currentEra" INTEGER NOT NULL DEFAULT 1,
"eraStartDate" TIMESTAMP(3) NOT NULL, "eraStartDate" TIMESTAMP(3) NOT NULL,
"minuteDistribution" DECIMAL(30,18) NOT NULL, "secondDistribution" DECIMAL(30,18) NOT NULL,
"network_total_contribution" DECIMAL(30, 8) NOT NULL DEFAULT 0,
"total_tree_count" INTEGER NOT NULL DEFAULT 0,
"contribution_per_tree" DECIMAL(20, 10) NOT NULL DEFAULT 22617,
"network_last_synced_at" TIMESTAMP(3),
"isActive" BOOLEAN NOT NULL DEFAULT false, "isActive" BOOLEAN NOT NULL DEFAULT false,
"activatedAt" TIMESTAMP(3), "activatedAt" TIMESTAMP(3),
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
@ -38,7 +43,7 @@ CREATE TABLE "mining_eras" (
"endDate" TIMESTAMP(3), "endDate" TIMESTAMP(3),
"initialDistribution" DECIMAL(30,8) NOT NULL, "initialDistribution" DECIMAL(30,8) NOT NULL,
"totalDistributed" DECIMAL(30,8) NOT NULL DEFAULT 0, "totalDistributed" DECIMAL(30,8) NOT NULL DEFAULT 0,
"minuteDistribution" DECIMAL(30,18) NOT NULL, "secondDistribution" DECIMAL(30,18) NOT NULL,
"isActive" BOOLEAN NOT NULL DEFAULT true, "isActive" BOOLEAN NOT NULL DEFAULT true,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
@ -67,7 +72,7 @@ CREATE TABLE "mining_records" (
"miningMinute" TIMESTAMP(3) NOT NULL, "miningMinute" TIMESTAMP(3) NOT NULL,
"contributionRatio" DECIMAL(30,18) NOT NULL, "contributionRatio" DECIMAL(30,18) NOT NULL,
"totalContribution" DECIMAL(30,8) NOT NULL, "totalContribution" DECIMAL(30,8) NOT NULL,
"minuteDistribution" DECIMAL(30,18) NOT NULL, "secondDistribution" DECIMAL(30,18) NOT NULL,
"minedAmount" DECIMAL(30,18) NOT NULL, "minedAmount" DECIMAL(30,18) NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
@ -94,6 +99,90 @@ CREATE TABLE "mining_transactions" (
CONSTRAINT "mining_transactions_pkey" PRIMARY KEY ("id") CONSTRAINT "mining_transactions_pkey" PRIMARY KEY ("id")
); );
-- CreateTable: 系统挖矿账户
CREATE TABLE "system_mining_accounts" (
"id" TEXT NOT NULL,
"account_type" TEXT NOT NULL,
"region_code" TEXT,
"name" TEXT NOT NULL,
"totalMined" DECIMAL(30, 8) NOT NULL DEFAULT 0,
"availableBalance" DECIMAL(30, 8) NOT NULL DEFAULT 0,
"totalContribution" DECIMAL(30, 8) NOT NULL DEFAULT 0,
"last_synced_at" TIMESTAMP(3),
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
CONSTRAINT "system_mining_accounts_pkey" PRIMARY KEY ("id")
);
-- CreateTable: 系统账户挖矿记录
CREATE TABLE "system_mining_records" (
"id" TEXT NOT NULL,
"system_account_id" TEXT NOT NULL,
"mining_minute" TIMESTAMP(3) NOT NULL,
"contribution_ratio" DECIMAL(30, 18) NOT NULL,
"total_contribution" DECIMAL(30, 8) NOT NULL,
"second_distribution" DECIMAL(30, 18) NOT NULL,
"mined_amount" DECIMAL(30, 18) NOT NULL,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "system_mining_records_pkey" PRIMARY KEY ("id")
);
-- CreateTable: 系统账户交易流水
CREATE TABLE "system_mining_transactions" (
"id" TEXT NOT NULL,
"system_account_id" TEXT NOT NULL,
"type" TEXT NOT NULL,
"amount" DECIMAL(30, 8) NOT NULL,
"balance_before" DECIMAL(30, 8) NOT NULL,
"balance_after" DECIMAL(30, 8) NOT NULL,
"reference_id" TEXT,
"reference_type" TEXT,
"memo" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "system_mining_transactions_pkey" PRIMARY KEY ("id")
);
-- CreateTable: 待解锁算力挖矿
CREATE TABLE "pending_contribution_mining" (
"id" BIGSERIAL NOT NULL,
"source_adoption_id" BIGINT NOT NULL,
"source_account_sequence" VARCHAR(20) NOT NULL,
"would_be_account_sequence" VARCHAR(20),
"contribution_type" VARCHAR(30) NOT NULL,
"amount" DECIMAL(30, 10) NOT NULL,
"reason" VARCHAR(200),
"effective_date" DATE NOT NULL,
"expire_date" DATE NOT NULL,
"is_expired" BOOLEAN NOT NULL DEFAULT false,
"last_synced_at" TIMESTAMP(3),
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "pending_contribution_mining_pkey" PRIMARY KEY ("id")
);
-- CreateTable: 待解锁算力挖矿记录
CREATE TABLE "pending_mining_records" (
"id" BIGSERIAL NOT NULL,
"pending_contribution_id" BIGINT NOT NULL,
"mining_minute" TIMESTAMP(3) NOT NULL,
"source_adoption_id" BIGINT NOT NULL,
"source_account_sequence" VARCHAR(20) NOT NULL,
"would_be_account_sequence" VARCHAR(20),
"contribution_type" VARCHAR(30) NOT NULL,
"contribution_amount" DECIMAL(30, 10) NOT NULL,
"network_total_contribution" DECIMAL(30, 10) NOT NULL,
"contribution_ratio" DECIMAL(30, 18) NOT NULL,
"second_distribution" DECIMAL(30, 18) NOT NULL,
"mined_amount" DECIMAL(30, 18) NOT NULL,
"allocated_to" VARCHAR(20) NOT NULL DEFAULT 'HEADQUARTERS',
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "pending_mining_records_pkey" PRIMARY KEY ("id")
);
-- CreateTable -- CreateTable
CREATE TABLE "mining_reward_allocations" ( CREATE TABLE "mining_reward_allocations" (
"id" BIGSERIAL NOT NULL, "id" BIGSERIAL NOT NULL,
@ -316,6 +405,33 @@ CREATE INDEX "mining_transactions_counterparty_account_seq_idx" ON "mining_trans
-- CreateIndex -- CreateIndex
CREATE INDEX "mining_transactions_counterparty_user_id_idx" ON "mining_transactions"("counterparty_user_id"); CREATE INDEX "mining_transactions_counterparty_user_id_idx" ON "mining_transactions"("counterparty_user_id");
-- CreateIndex: system_mining_accounts
CREATE UNIQUE INDEX "system_mining_accounts_account_type_region_code_key" ON "system_mining_accounts"("account_type", "region_code");
CREATE INDEX "system_mining_accounts_totalContribution_idx" ON "system_mining_accounts"("totalContribution" DESC);
CREATE INDEX "system_mining_accounts_account_type_idx" ON "system_mining_accounts"("account_type");
CREATE INDEX "system_mining_accounts_region_code_idx" ON "system_mining_accounts"("region_code");
-- CreateIndex: system_mining_records
CREATE UNIQUE INDEX "system_mining_records_system_account_id_mining_minute_key" ON "system_mining_records"("system_account_id", "mining_minute");
CREATE INDEX "system_mining_records_mining_minute_idx" ON "system_mining_records"("mining_minute");
-- CreateIndex: system_mining_transactions
CREATE INDEX "system_mining_transactions_system_account_id_created_at_idx" ON "system_mining_transactions"("system_account_id", "created_at" DESC);
-- CreateIndex: pending_contribution_mining
CREATE UNIQUE INDEX "pending_contribution_mining_source_adoption_id_would_be_acco_key"
ON "pending_contribution_mining"("source_adoption_id", "would_be_account_sequence", "contribution_type");
CREATE INDEX "pending_contribution_mining_would_be_account_sequence_idx" ON "pending_contribution_mining"("would_be_account_sequence");
CREATE INDEX "pending_contribution_mining_contribution_type_idx" ON "pending_contribution_mining"("contribution_type");
CREATE INDEX "pending_contribution_mining_is_expired_idx" ON "pending_contribution_mining"("is_expired");
-- CreateIndex: pending_mining_records
CREATE UNIQUE INDEX "pending_mining_records_pending_contribution_id_mining_minute_key"
ON "pending_mining_records"("pending_contribution_id", "mining_minute");
CREATE INDEX "pending_mining_records_mining_minute_idx" ON "pending_mining_records"("mining_minute");
CREATE INDEX "pending_mining_records_source_account_sequence_idx" ON "pending_mining_records"("source_account_sequence");
CREATE INDEX "pending_mining_records_would_be_account_sequence_idx" ON "pending_mining_records"("would_be_account_sequence");
-- CreateIndex -- CreateIndex
CREATE INDEX "mining_reward_allocations_mining_date_idx" ON "mining_reward_allocations"("mining_date"); CREATE INDEX "mining_reward_allocations_mining_date_idx" ON "mining_reward_allocations"("mining_date");
@ -415,8 +531,27 @@ ALTER TABLE "mining_records" ADD CONSTRAINT "mining_records_accountSequence_fkey
-- AddForeignKey -- AddForeignKey
ALTER TABLE "mining_transactions" ADD CONSTRAINT "mining_transactions_accountSequence_fkey" FOREIGN KEY ("accountSequence") REFERENCES "mining_accounts"("accountSequence") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "mining_transactions" ADD CONSTRAINT "mining_transactions_accountSequence_fkey" FOREIGN KEY ("accountSequence") REFERENCES "mining_accounts"("accountSequence") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey: system_mining_records
ALTER TABLE "system_mining_records" ADD CONSTRAINT "system_mining_records_system_account_id_fkey"
FOREIGN KEY ("system_account_id") REFERENCES "system_mining_accounts"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey: system_mining_transactions
ALTER TABLE "system_mining_transactions" ADD CONSTRAINT "system_mining_transactions_system_account_id_fkey"
FOREIGN KEY ("system_account_id") REFERENCES "system_mining_accounts"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey: pending_mining_records
ALTER TABLE "pending_mining_records" ADD CONSTRAINT "pending_mining_records_pending_contribution_id_fkey"
FOREIGN KEY ("pending_contribution_id") REFERENCES "pending_contribution_mining"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey -- AddForeignKey
ALTER TABLE "burn_records" ADD CONSTRAINT "burn_records_blackHoleId_fkey" FOREIGN KEY ("blackHoleId") REFERENCES "black_holes"("id") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "burn_records" ADD CONSTRAINT "burn_records_blackHoleId_fkey" FOREIGN KEY ("blackHoleId") REFERENCES "black_holes"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey -- AddForeignKey
ALTER TABLE "pool_transactions" ADD CONSTRAINT "pool_transactions_pool_account_id_fkey" FOREIGN KEY ("pool_account_id") REFERENCES "pool_accounts"("id") ON DELETE RESTRICT ON UPDATE CASCADE; ALTER TABLE "pool_transactions" ADD CONSTRAINT "pool_transactions_pool_account_id_fkey" FOREIGN KEY ("pool_account_id") REFERENCES "pool_accounts"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- 初始化系统账户 (无 regionCode 的汇总账户)
INSERT INTO "system_mining_accounts" ("id", "account_type", "region_code", "name", "totalMined", "availableBalance", "totalContribution", "updated_at")
VALUES
(gen_random_uuid(), 'OPERATION', NULL, '运营账户', 0, 0, 0, NOW()),
(gen_random_uuid(), 'HEADQUARTERS', NULL, '总部账户', 0, 0, 0, NOW())
ON CONFLICT ("account_type", "region_code") DO NOTHING;


@@ -0,0 +1,47 @@
+-- CreateTable: batch reissue execution log (globally allowed to run only once)
+CREATE TABLE "batch_mining_executions" (
+    "id" TEXT NOT NULL,
+    "operator_id" TEXT NOT NULL,
+    "operator_name" TEXT NOT NULL,
+    "reason" TEXT NOT NULL,
+    "total_users" INTEGER NOT NULL,
+    "total_batches" INTEGER NOT NULL,
+    "success_count" INTEGER NOT NULL DEFAULT 0,
+    "failed_count" INTEGER NOT NULL DEFAULT 0,
+    "total_amount" DECIMAL(30,8) NOT NULL DEFAULT 0,
+    "executed_at" TIMESTAMP(3) NOT NULL,
+    "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    CONSTRAINT "batch_mining_executions_pkey" PRIMARY KEY ("id")
+);
+
+-- CreateTable: batch reissue detail records
+CREATE TABLE "batch_mining_records" (
+    "id" TEXT NOT NULL,
+    "execution_id" TEXT NOT NULL,
+    "account_sequence" TEXT NOT NULL,
+    "batch" INTEGER NOT NULL,
+    "tree_count" INTEGER NOT NULL,
+    "pre_mine_days" INTEGER NOT NULL,
+    "user_contribution" DECIMAL(30,10) NOT NULL,
+    "network_contribution" DECIMAL(30,10) NOT NULL,
+    "contribution_ratio" DECIMAL(30,18) NOT NULL,
+    "total_seconds" BIGINT NOT NULL,
+    "amount" DECIMAL(30,8) NOT NULL,
+    "remark" TEXT,
+    "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    CONSTRAINT "batch_mining_records_pkey" PRIMARY KEY ("id")
+);
+
+-- CreateIndex
+CREATE UNIQUE INDEX "batch_mining_records_execution_id_account_sequence_key" ON "batch_mining_records"("execution_id", "account_sequence");
+
+-- CreateIndex
+CREATE INDEX "batch_mining_records_batch_idx" ON "batch_mining_records"("batch");
+
+-- CreateIndex
+CREATE INDEX "batch_mining_records_account_sequence_idx" ON "batch_mining_records"("account_sequence");
+
+-- AddForeignKey
+ALTER TABLE "batch_mining_records" ADD CONSTRAINT "batch_mining_records_execution_id_fkey" FOREIGN KEY ("execution_id") REFERENCES "batch_mining_executions"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
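
The migration does not spell out how the numeric columns of `batch_mining_records` relate. Assuming the obvious reading of the column names (an assumption, not confirmed by this diff), each reissued amount would be the user's share of the per-second distribution over the covered seconds:

```typescript
// Assumed relation between the columns above (not stated in the migration):
//   contribution_ratio = user_contribution / network_contribution
//   amount             = contribution_ratio * secondDistribution * total_seconds
function reissuedAmount(
  userContribution: number,
  networkContribution: number,
  secondDistribution: number,
  totalSeconds: number,
): number {
  const contributionRatio = userContribution / networkContribution;
  return contributionRatio * secondDistribution * totalSeconds;
}
```

For example, a user holding 10 of 100 network contribution over 60 seconds at 2 per second would be reissued 12.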

Some files were not shown because too many files have changed in this diff.