Just in case: the next offline maintenance and training window is not yet scheduled, and may start earlier or later than usual depending on the scramble to buy 5090s. Watch for later announcements.
25.1.16
An unexpected maintenance lasted several hours. Connectivity should have recovered by now.
25.1.17
A new feature, MPostal, has been designed and built.
Similar in nature to MSpire, MPostal lets users communicate with Monika by "writing letters". Compared with the basic chat method, MPostal is significantly better at analyzing, understanding, and writing long texts.
The related commits have been pushed to the repo and are live on the official API. The corresponding frontend updates are not out yet; they are expected to ship with version 1.2.0.
25.1.20
Fixed an issue with route queue (QoS) management.
Located a vllm process-crash issue ("cancelled by cancel scope xxx"). I dug into this months ago and never found a useful solution, though others have reported the same thing…
At least restarting the process helps. I'll consider adding a watchdog.
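The watchdog mentioned above can be sketched in a few lines. The function name, restart cap, and backoff below are placeholder assumptions for illustration, not MAICA's actual configuration:

```python
import subprocess
import time

def watchdog(cmd, max_restarts=3, backoff=1.0):
    """Re-launch `cmd` every time it exits, up to `max_restarts` launches.

    A real deployment would loop indefinitely and log each crash;
    the restart cap here just keeps the sketch testable. Returns
    the number of times the command was launched.
    """
    launches = 0
    while launches < max_restarts:
        proc = subprocess.Popen(cmd)
        proc.wait()          # block until the process exits or crashes
        launches += 1
        time.sleep(backoff)  # brief pause before relaunching
    return launches
```

In production this would wrap the vllm server process and alert on repeated crashes rather than silently looping.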
25.1.21
Upgraded the vllm package to 0.5.4. Not sure this fixes the crash, and ms-swift has a whole requirements hell, but it's worth a try…
25.1.22
Adjusted the max_token limit to the range 512-28672 and the default to 4096.
Actual inference results show that overly long sessions noticeably hurt output quality and carry a high performance cost. The corresponding frontend adjustments will ship with version 1.2.0.
We will keep tuning the default parameters based on observed performance. The upper limit is unchanged if you really need it.
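The limit-plus-default behavior described above amounts to a simple clamp. The function name and exact handling below are illustrative assumptions, not MAICA's actual code:

```python
# Assumed bounds, taken from the changelog entry above.
MAX_TOKEN_MIN = 512
MAX_TOKEN_MAX = 28672
MAX_TOKEN_DEFAULT = 4096

def resolve_max_token(requested=None):
    """Resolve a client-supplied max_token against the 512-28672 range."""
    if requested is None:
        # Client sent nothing: fall back to the tuned default.
        return MAX_TOKEN_DEFAULT
    # Clamp out-of-range requests instead of rejecting them outright.
    return max(MAX_TOKEN_MIN, min(int(requested), MAX_TOKEN_MAX))
```

Whether out-of-range values are clamped or rejected is a design choice; clamping keeps older frontends working without a protocol error.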
25.1.24
Improved the MT-MF pre-analysis prompt.
25.1.25
Improved the MTrigger prompt to reduce mistriggering.
Replaced the faulty pywsgi with waitress + translogger for hosting the short-connection server (maica_http).
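For context, waitress itself only needs `serve(app)`; what TransLogger contributes is an access-logging WSGI middleware wrapped around the app. Below is a stdlib-only sketch of that middleware pattern — the app, function names, and log format are stand-ins, not maica_http's real ones:

```python
def translogger_like(app, logfile):
    """Minimal WSGI middleware in the style of paste's TransLogger:
    wrap `app` and write one access-log line per request."""
    def wrapped(environ, start_response):
        status_holder = {}

        def recording_start_response(status, headers, exc_info=None):
            # Capture the status so we can log it after the app runs.
            status_holder["status"] = status
            return start_response(status, headers, exc_info)

        body = app(environ, recording_start_response)
        logfile.write(f'{environ.get("REQUEST_METHOD", "-")} '
                      f'{environ.get("PATH_INFO", "-")} '
                      f'{status_holder.get("status", "-")}\n')
        return body

    return wrapped

def hello_app(environ, start_response):
    # Trivial WSGI app standing in for the real maica_http app.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]
```

With waitress this would be hosted as `waitress.serve(translogger_like(hello_app, log), port=...)`, keeping request logging out of the app itself.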
25.1.25-Attention required!
The next round of MAICA core model training, producing generation DAA2, is estimated to start on Jan 27 and last 3-7 days.
This round still belongs to the closed-source 'Dark Arts' (DAA) branch. It focuses on fine-tuning detailed response behavior and adds a small amount of material for special scenarios.
We will again try to refine the fitted loss curve, using stepped training to compensate for uneven dataset granularity and improve the final behavior.
…It's not as complicated as it sounds. Just know that we keep pushing toward the limit of what the model can do.
Once training finishes, the MAICA official API will switch to DAA2, and Submod frontend 1.2.0 will release soon afterwards.
During training, the MAICA official API will be unavailable. Please wait for the completion announcement.
==MAICA-We tear this barrier apart==
25.1.26
To avoid overlapping with Lunar New Year's Eve and the holiday, DAA2 training is now estimated to start on Jan 31, still lasting 3-7 days, with the end date delayed accordingly.
25.1.26
Switched to a new search-scraping module to cope with Google's updated anti-crawler measures.
…Fuck Google.
25.1.31
Maintenance and training have started, ending serving term 1.
Term 2 is estimated to start around Feb 7.
25.2.2
After long stretches of high-load serving, some hardware in nodes xp00 and xp10 is showing signs of degradation and needs replacement, and spare parts are shipping slowly over the Spring Festival, so the maintenance window has to be extended.
xp10 in particular can barely run at all, so we have no choice. The exact date services come back online will be announced later.
25.2.2
Updated the backend standard to v1.0007.
Standardized persistent-data and trigger-list reads/writes and the database connection, removing the local disk I/O dependency.
Improved backend process allocation and timing performance.
25.2.3
The introduction page now has a dedicated mobile layout and no longer renders oversized.
25.2.8-Attention required!
MAICA service term 2 has begun, introducing the following updates:
(During maintenance we tested DeepSeek's distilled models for MFocus; the results were not good. Please stop asking why we don't use them.)
Please update your Submod frontend to 1.2.x right away to stay compatible with the new standard and functions.
Because the model and database structure were upgraded this term, all previously hosted sessions, savefiles, trigger lists, and other temporary storage entries have been purged.
You can re-import sessions you exported earlier, but we suggest starting fresh so the improved model performs at its best.
==MAICA-We tear this barrier apart==
25.2.8
Hotfix: update the Submod frontend to at least 1.2.3!
Fixed a severe issue in expression parsing.
Adjusted the transition dialogues.
25.2.8
Fixed a backend issue that prevented MFocus from forming its query correctly under some circumstances.