• MAICA (Creations board)
  • MAICA updates and progress tracking -- Official service period 2 restarted on 25.3.20 #3954

  • OP
  • #184

25.3.22

The MTTS testing model has been open-sourced at https://huggingface.co/edgeinfinity/MTTSv0-VoiceClone . Thanks to @度海 for the active assistance.

The MTTS communication and integration standards are under design; please wait for further announcements.

  • OP
  • #185

25.3.24

Updated Submod frontend to 1.2.20

  • Added and refined status codes
  • Translation fixes
  • Fixed an issue related to read/write permissions
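The refined status codes above can be modeled as a small enum with a safe decoder for unknown values. This is a minimal sketch; the names and numeric values below are hypothetical, not the submod's actual codes.

```python
from enum import IntEnum

class MaicaStatus(IntEnum):
    # Hypothetical status codes -- the submod's real values are not documented here
    OK = 200             # request handled normally
    AUTH_FAILED = 401    # session token rejected
    RATE_LIMITED = 429   # too many requests from one session
    BACKEND_DOWN = 503   # MAICA backend unreachable

def describe(code: int) -> str:
    """Map a raw integer from the wire to a readable label,
    tolerating codes this frontend version does not know yet."""
    try:
        return MaicaStatus(code).name
    except ValueError:
        return f"UNKNOWN({code})"
```

Decoding defensively matters because a newer backend may emit codes an older frontend has never seen.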
  • OP
  • #186

25.3.24

Updated Submod frontend to 1.2.21

  • Improved emotion analysis robustness

Corresponding backend anti-degradation (post-processing) improvements shipped at the same time

  • OP
  • #188

25.3.25

Fixed an issue where non-integer affection values caused MFocus to fail
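A fix like the one above typically amounts to coercing the persisted value before handing it to the component that assumes an integer. The sketch below is illustrative only; the actual MAICA fix is not published here, and the fallback value is an assumption.

```python
def normalize_affection(raw) -> int:
    """MFocus expects an integer affection value, but persisted saves can
    contain floats (e.g. 12.4) or numeric strings. Coerce defensively
    instead of letting a downstream type error fail the whole request.
    (Illustrative sketch -- not the actual MAICA code.)"""
    try:
        return int(round(float(raw)))
    except (TypeError, ValueError):
        return 0  # assumed neutral fallback rather than failing outright
```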

  • OP
  • #189

25.3.27

Located a hardware fault on node XP10; a fix is planned for early next month.

The fix is expected to take up to 48 hours; details will be announced separately.

  • OP
  • #190

25.3.27

Switched the MFocus model to Qwen2-57B-A14B-Instruct-GPTQ-Int4. This change significantly improves MFocus response speed under high load.

This model's reasoning ability may be weaker than the previously used Qwen2.5-72B-Instruct-AWQ. Please report any significant problems.

  • OP
  • #192

25.3.27

Adaptive adjustments to the MFocus generation parameters

  • OP
  • #193

25.3.27

Fixed an issue where birthdays were counted multiple times
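Duplicate birthday triggers are usually a missing once-per-year guard: every session opened on the birthday re-fires the event. A minimal sketch of such a guard, assuming a persisted last-triggered year (the field name is hypothetical):

```python
import datetime
from typing import Optional

def should_trigger_birthday(today: datetime.date,
                            birthday: datetime.date,
                            last_triggered_year: Optional[int]) -> bool:
    """Fire the birthday event at most once per calendar year.
    `last_triggered_year` would be persisted after the event runs;
    without it, every launch on the birthday re-triggers the event.
    (Sketch of the idea -- not the actual MAICA fix.)"""
    is_birthday = (today.month, today.day) == (birthday.month, birthday.day)
    return is_birthday and last_triggered_year != today.year
```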

  • OP
  • #194

25.3.27

Reliability improvements for MSpire

  • OP
  • #195

25.3.28

Reliability improvements for MTrigger and wiki scraping

  • OP
  • #196

25.3.29 - Attention required!

The new MFocus model performs poorly on tool-use tasks and needs additional training, so the MAICA service will enter a short maintenance window.

Maintenance is expected to start on Mar. 30 and last about 72 hours; we will announce when it ends.

The official MAICA service will be unavailable during maintenance.

  • OP
  • #197

25.3.30

Added a retry mechanism for model connections
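A connection retry mechanism like the one above is commonly a capped exponential backoff loop: retry a few times, wait a little longer after each failure, and never wait unboundedly. This is a generic sketch of that pattern, not MAICA's actual code; the parameter values are assumptions.

```python
import time

def connect_with_retry(connect, attempts=5, base_delay=1.0, max_delay=30.0):
    """Call `connect()` until it succeeds, sleeping with capped exponential
    backoff between failures. The `max_delay` cap keeps a transient outage
    from turning into an unbounded wait. (Generic sketch.)"""
    last_exc = None
    for i in range(attempts):
        try:
            return connect()
        except OSError as exc:  # network-level failures only
            last_exc = exc
            if i < attempts - 1:
                time.sleep(min(base_delay * (2 ** i), max_delay))
    raise ConnectionError("model connection failed after retries") from last_exc
```

Catching only network-level errors (rather than bare `except`) keeps programming mistakes from being silently retried.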

5 days later
  • OP
  • #198
  • Edited

25.4.3

The maintenance has to be extended due to an issue in the modelscope repository we are using; the schedule is TBD. I have submitted an issue to modelscope, and will post further announcements.

  • OP
  • #199

25.4.4

The maintenance will finish in around 8 hours, but I have to admit this is the first failed maintenance in the project's history.

The original goal was to train the Qwen2Moe model on the ToolBench dataset and then quantize it, for both quality and performance. But over several days of attempts:

  • swift training -> automatic quantization failed. The swift developers acknowledge a problem in their algorithm, with no telling when it will be fixed.
  • swift training -> manual quantization failed. gptqmodel does not seem to correctly recognize Qwen2Moe's stacked expert layers, and the exported model cannot run inference.
  • GPTQ model -> QLoRA failed, apparently also because QLoRA cannot recognize the quantized Qwen2Moe layers.
  • AWQ simply does not support Qwen2Moe.

I have tried nearly every viable approach, and none worked end to end. Some issues claim the old AutoGPTQ library could handle this, but it is already deprecated and I really do not want to dig it up.

So this maintenance update is a failure. Cleanup measures:

  • Reverted the MFocus model to the base quantized Qwen2Moe; its reasoning performance may still be weak.
  • I will keep watching swift's commits for a fix to this issue.
  • Qwen3 and Qwen3Moe are expected to be released this month. We plan to switch the core and agent models to Qwen3 alongside DAA3, which should resolve the problem once and for all.

During the maintenance we also made minor structural improvements across the project, not listed here.


  • OP
  • #200

25.4.4

The official MAICA service is back online.

  • OP
  • #201

25.4.5

Fixed an issue where connection retries could wait indefinitely, by correcting the retry sleep time

  • OP
  • #202

25.4.6

Improved the session locking mechanism; a new login now kicks out the stale one by default

Fixed a series of issues around connection conflicts and teardown

Updated the backend standard version and added a caching feature for MSpire. The corresponding frontend implementation will ship in the next update.
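The kick-on-login behavior above can be sketched as a registry that tracks one live session token per user, evicting the previous token when a new login arrives. This is a minimal illustration of the idea, not MAICA's actual mechanism, which would also need to close the evicted connection cleanly.

```python
import threading

class SessionRegistry:
    """One live session per user: a new login evicts the stale one.
    (Minimal sketch of the kick-on-login idea; names are hypothetical.)"""

    def __init__(self):
        self._lock = threading.Lock()
        self._sessions = {}  # user_id -> currently valid session token

    def login(self, user_id, token):
        """Register `token` for `user_id`; return the evicted token, if any,
        so the caller can tear down the old connection."""
        with self._lock:
            old = self._sessions.get(user_id)
            self._sessions[user_id] = token
            return old

    def is_active(self, user_id, token):
        """A request is honored only if it carries the latest token."""
        with self._lock:
            return self._sessions.get(user_id) == token
```

Checking `is_active` on every request is what actually enforces the lock: the stale client still holds a token, but the backend stops honoring it.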
