Weekly Artificial Intelligence & Machine Learning / Open-source AI models Insights

Stay ahead with our expertly curated weekly insights on the latest trends, developments, and news in Artificial Intelligence & Machine Learning - Open-source AI models.

Recent Articles

Deep Cogito v2: Open-source AI that hones its reasoning skills

Deep Cogito has launched Cogito v2, an open-source family of hybrid reasoning models in sizes up to 671B parameters. The models use Iterated Distillation and Amplification (IDA) to internalize reasoning during training, outperforming similarly sized open-source competitors while remaining cost-effective.


What is Iterated Distillation and Amplification (IDA) and how does it improve Deep Cogito v2's reasoning?
Iterated Distillation and Amplification (IDA) is a training technique where the AI model internalizes the reasoning process through iterative policy improvement rather than relying on longer search times during inference. This method enables Deep Cogito v2 models to learn more efficient and accurate reasoning skills, improving performance on complex tasks such as math and language benchmarks while remaining cost-effective.
Sources: [1]
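
The toy loop below sketches the IDA idea under heavy simplification: "amplification" spends extra compute (sampling plus a scoring oracle) to get a better answer than the bare policy would give on its own, and "distillation" copies the amplified answers back into the policy so the improvement is internalized for cheap inference. All names here, and the scoring oracle, are hypothetical illustrations, not Deep Cogito's training code.

```python
# Toy illustration of Iterated Distillation and Amplification (IDA).
# Everything here is a hypothetical simplification, not Cogito's code.
import random

random.seed(0)

# A trivially simple "model": maps a question to a guessed answer in [0, 10].
policy = {q: random.randint(0, 10) for q in range(5)}
truth = {q: q * 2 for q in range(5)}  # stand-in for ground-truth answers

def amplify(q, samples=8):
    """Amplification: spend extra compute (sampling + scoring) to beat
    the bare policy's direct answer."""
    candidates = [policy[q]] + [random.randint(0, 10) for _ in range(samples)]
    # Score candidates; distance to truth stands in for a verifier.
    return min(candidates, key=lambda a: abs(a - truth[q]))

def distill():
    """Distillation: move the fast policy toward the amplified answers,
    internalizing the improvement so no search is needed at inference."""
    for q in policy:
        policy[q] = amplify(q)

for step in range(3):  # iterate: each round's policy seeds the next
    distill()

print(policy)  # converges toward truth with no search at "inference" time
```

Each iteration of the loop improves the policy itself, which is the contrast the article draws with spending longer search times at inference.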
What does it mean that Deep Cogito v2 models are 'hybrid reasoning models'?
Deep Cogito v2 models are called hybrid reasoning models because they can toggle between two modes: a fast, direct-response mode for simple queries and a slower, step-by-step reasoning mode for complex problems. This hybrid approach allows the models to efficiently handle a wide range of tasks by balancing speed and depth of reasoning, outperforming other open-source models of similar size.
Sources: [1], [2]
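
As a hedged illustration of the toggle described above, the sketch below shows one entry point dispatching to either a fast direct path or a slower step-by-step path. The function names and the boolean switch are invented for illustration and are not Cogito v2's actual API.

```python
# Hypothetical sketch of a hybrid reasoning toggle; not Cogito's real API.

def answer_direct(query: str) -> str:
    # Fast path: a single direct response, no intermediate reasoning emitted.
    return f"direct answer to {query!r}"

def answer_with_reasoning(query: str) -> str:
    # Slow path: emit step-by-step reasoning before the final answer.
    steps = [f"step {i}: decompose {query!r}" for i in range(1, 4)]
    return "\n".join(steps + [f"final answer to {query!r}"])

def hybrid_answer(query: str, enable_reasoning: bool = False) -> str:
    """One model, two modes: the caller (or a heuristic) flips the switch."""
    return answer_with_reasoning(query) if enable_reasoning else answer_direct(query)

print(hybrid_answer("2 + 2"))                 # fast mode for a simple query
print(hybrid_answer("prove x^2 >= 0", True))  # reasoning mode for a hard one
```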

01 August, 2025
AI News

Leak suggests OpenAI’s open-source AI model release is imminent

A recent leak suggests OpenAI is about to release a powerful open-source AI model built on a 120-billion-parameter Mixture of Experts architecture. The release could reshape the competitive landscape, appealing to developers and researchers and marking a return to the company's open-source roots.


What does it mean that OpenAI's new AI model uses a Mixture of Experts (MoE) architecture?
The Mixture of Experts (MoE) architecture means the AI model is designed like a board of 128 specialist advisors rather than a single monolithic system. When a query is received, the system selects the four most relevant experts to handle the task. This approach allows the model to have a very large number of parameters (120 billion) for vast knowledge while maintaining speed and efficiency by activating only a small part of the model at a time.
Sources: [1]
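
A rough numerical sketch of top-k expert routing follows, using the figures from the leak (128 experts, 4 active per token). The gating math is the generic softmax-over-top-k pattern, not OpenAI's actual implementation, and all variable names and sizes other than 128/4 are assumptions.

```python
# Generic top-k Mixture-of-Experts routing sketch (128 experts, top 4),
# illustrating the leaked figures; not OpenAI's implementation.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 128, 4  # d_model chosen small for the demo

# Each "expert" is a small feed-forward weight matrix.
experts = rng.standard_normal((n_experts, d_model, d_model)) * 0.02
router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating weights

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector through its top-k experts only."""
    logits = x @ router                # score all 128 experts...
    top = np.argsort(logits)[-top_k:]  # ...but keep just the best 4
    gates = np.exp(logits[top])
    gates /= gates.sum()               # softmax over the chosen experts
    # Only top_k expert matmuls execute; most parameters stay idle.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(d_model)
print(moe_forward(token).shape)  # (64,) — same shape, 4/128 experts used
```

Because only 4 of the 128 expert matmuls run per token, the model can carry a very large parameter count while keeping per-token compute low, which is the speed-with-scale trade-off the answer above describes.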
Why is OpenAI releasing an open-source AI model now, and how is this significant?
OpenAI's release of a powerful open-source AI model marks a return to its original open-source roots after years of guarding its top-tier models. This move is significant because it could reshape the competitive landscape by making advanced AI technology more accessible to developers and researchers, fostering innovation and collaboration in the AI community.
Sources: [1]

01 August, 2025
AI News
