Same context, but Claude uses 1.5× the tokens of other models
I first noticed this when using Claude and GPT in OpenCode: the same conversation that showed 180k tokens with GPT jumped to 260k tokens once I switched to Claude. At the time I assumed GPT was simply more token-efficient. Now, using CC with Claude, a conversation shows 160k tokens; switching to GLM5.1 drops it to 100k tokens.
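Working the arithmetic from those two observations supports the roughly-1.5× claim. A minimal sketch (the variable names are my own; the counts are the ones reported above):

```python
# Reported token counts for the same conversation under different models.
opencode = {"gpt": 180_000, "claude": 260_000}   # OpenCode session
cc = {"glm": 100_000, "claude": 160_000}         # CC session

# Ratio of Claude's count to the other model's count in each session.
ratio_opencode = opencode["claude"] / opencode["gpt"]  # ~1.44
ratio_cc = cc["claude"] / cc["glm"]                    # 1.6
print(round(ratio_opencode, 2), round(ratio_cc, 2))    # → 1.44 1.6
```

So the two sessions bracket the 1.5× figure (about 1.44× and 1.6×), which is consistent with Claude's tokenizer splitting the same text into more tokens than GPT's or GLM's.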