Text Generation

Obsidian-3B: The Smallest Multimodal LLM

Can a 3B model compete with the giants?

Donato Riccio

07 Dec 2023 — 1 min read
Photo by Zoltan Tasi on Unsplash

This post is for paying subscribers only


Read more

Chain-of-Thought Prompting: Not the Universal Solution We Thought

Chain-of-thought (CoT) prompting, where AI models explain their reasoning step by step, has become the default approach for complex AI tasks. But new research suggests it is not the silver bullet many thought it was.

By Donato Riccio 03 Dec 2024
Why Size Matters in LLMs?

How the Number of Parameters Influences LLM Performance

By Donato Riccio 20 Nov 2024

Rerank for Better RAG

Methods to enhance relevance and diversity in your retrievals

By Donato Riccio 30 Oct 2024

Do We Have Enough Data to Train LLMs?

Training optimal Large Language Models

By Donato Riccio 12 Oct 2024