8 links tagged with all of: deepseek + ai
Links
TNG Technology Consulting GmbH has unveiled R1T2, a new variant of DeepSeek R1-0528 that operates 200% faster while maintaining high reasoning performance. With significant reductions in output token count and inference time, R1T2 is tailored for enterprise applications, offering an open-source solution under the MIT License.
DeepSeek R2, powered by Huawei's Ascend AI chip, is expected to launch officially between August 15 and 30. The new AI reasoning model is anticipated to compete with OpenAI's ChatGPT 5, signaling a significant advancement in AI technology.
Reflection has successfully raised $2 billion to establish itself as a leading AI lab in the U.S., aiming to compete with industry giants like DeepSeek. The funding will support the development of innovative AI technologies and research initiatives that challenge existing paradigms in the field.
Microsoft AI has introduced MAI-DS-R1, a new variant of the DeepSeek R1 model, featuring open weights and enhanced capabilities for responding to blocked topics while reducing harmful content. The model demonstrates significant improvements in responsiveness and satisfaction metrics compared to its predecessors, making it a valuable resource for researchers and developers.
Hangzhou has emerged as a leading tech center in China, largely due to its entrepreneurial spirit and talent pool. The city's transformation was marked by the success of local company DeepSeek, which developed an AI model that competes with American technology at a lower cost, igniting excitement among local entrepreneurs.
DeepSeek aims to launch its AI agent by the end of 2025, positioning itself as a competitor to OpenAI. The company is focusing on developing advanced AI capabilities to challenge existing players in the market, particularly in the realm of conversational agents.
DeepSeek has launched its Terminus model, an update to the V3.1 family that improves agentic tool use and reduces language mixing errors. The new version enhances performance in tasks requiring tool interaction while maintaining its open-source accessibility under an MIT License, challenging proprietary models in the AI landscape.
DeepSeek V3 is a 685B-parameter mixture-of-experts model and the latest advancement in the DeepSeek chat model family. It succeeds the previous generation of DeepSeek chat models and demonstrates strong performance across a wide range of tasks.