AI Glossary

AI Terms Starting With “H”

3 terms defined

Hallucination

Core AI

When an AI model generates information that sounds plausible but is factually incorrect or entirely fabricated. Hallucinations occur because LLMs generate text based on statistical patterns rather than verified facts. Always verify AI-generated factual claims, especially for legal, medical, or financial content.

Example: An AI might confidently cite a research paper that does not exist, or state an incorrect statistic.
Related: Large Language Model, Retrieval-Augmented Generation (RAG), Grounding

HeyGen

AI Video

An AI video generation platform specializing in AI avatars and video translation. HeyGen allows users to create professional talking-head videos with AI presenters, translate existing videos into 40+ languages while preserving lip sync, and clone their own appearance and voice for scalable video content.

Related: AI Avatar, AI Voice Cloning, Text-to-Video

Human-in-the-Loop (HITL)

AI Agents

An AI system design where a human is required to review, approve, or correct AI outputs at key decision points before the system proceeds. HITL is essential for high-stakes applications (legal, medical, financial) where AI errors are costly. It balances AI efficiency with human judgment and accountability.

Example: An AI that drafts a commercial lease abstract but requires a human attorney to approve it before it is sent to the client.
Related: AI Governance, AI Automation, Autonomous Agent, AI Safety
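As a minimal sketch of the HITL pattern (the function and variable names here are illustrative, not from any specific framework), the human review step can be modeled as an approval gate between the AI draft and delivery:

```python
def require_human_approval(draft, reviewer):
    """Gate an AI-generated draft behind a human decision point.

    `reviewer` is any callable representing the human step: it returns
    an approved (possibly edited) version of the draft, or None to
    reject it. The system only proceeds with reviewer-approved text.
    """
    reviewed = reviewer(draft)
    if reviewed is None:
        # Rejection halts the pipeline instead of silently shipping AI output.
        raise ValueError("Draft rejected by human reviewer")
    return reviewed

# Example: an attorney approves a lease abstract after a small edit.
ai_draft = "Lease term: 5 years, base rent $30/sq ft."
approved = require_human_approval(ai_draft, lambda d: d + " (attorney-reviewed)")
print(approved)
```

The key design point is that the approval call sits on the critical path: the AI output cannot reach the client without passing through the human checkpoint.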
