Visually stunning for designers, robust enough for complex architectures, and optimized for real-time AI streaming.
Demo • Installation • AI Integration • Stack UI • Themes
NotifyX v4 is engineered from the ground up to solve the challenges of modern web applications. Whether you are building an LLM-powered agent workflow, a Next.js SaaS, or an interactive dashboard, NotifyX delivers a pristine user experience.
| Feature | NotifyX v4 | react-hot-toast | sonner | react-toastify |
|---|---|---|---|---|
| Stack-Based UI | ✅ Elegant 3D | ❌ | ❌ | ❌ |
| AI/Streaming API | ✅ Native | ❌ | ❌ | ❌ |
| MCP-Ready | ✅ | ❌ | ❌ | ❌ |
| Web Animations API | ✅ GPU Accelerated | ❌ | limited | limited |
| Theme System | ✅ 6 robust themes | ❌ | partial | partial |
| Vanilla JS Support | ✅ | ❌ | ❌ | partial |
| Priority Queue | ✅ | ❌ | ❌ | ❌ |
| Zero Dependencies | ✅ | ❌ | ❌ | ❌ |
| Bundle < 7KB | ✅ | ❌ | ❌ | ❌ |
```bash
# npm
npm install notifyx

# pnpm
pnpm add notifyx

# bun
bun add notifyx
```

```js
import NotifyX from "notifyx";
import "notifyx/style.css"; // Required for UI styling

NotifyX.success("Ready for the AI era!");
```

NotifyX v4 departs from legacy list-based rendering, introducing a Stack-Based Architecture.
Notifications stack gracefully with dynamic `transform: scale()` and `translateY()` 3D layering, saving vertical screen real estate while feeling incredibly premium.
Under the hood, the Priority Queue Manager absorbs sudden influxes of notifications. If an application fires 20 events at once, NotifyX instantly displays up to the configured maximum (`maxToasts`), gracefully holds the rest in memory, and seamlessly renders them as active toasts are dismissed.
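The overflow behavior can be sketched roughly as follows. This is a minimal illustration, not NotifyX's actual internals: the `ToastOverflowQueue` name is hypothetical, only `maxToasts` comes from the API above, and where NotifyX uses a priority-ordered queue this sketch uses plain FIFO for brevity.

```js
// Illustrative sketch of maxToasts overflow handling (not the real implementation).
class ToastOverflowQueue {
  constructor(maxToasts) {
    this.maxToasts = maxToasts;
    this.active = []; // toasts currently rendered on screen
    this.pending = []; // toasts held in memory until a slot frees up
  }

  push(toast) {
    if (this.active.length < this.maxToasts) {
      this.active.push(toast); // render immediately
    } else {
      this.pending.push(toast); // hold for later
    }
  }

  dismiss(toast) {
    this.active = this.active.filter((t) => t !== toast);
    const next = this.pending.shift();
    if (next) this.active.push(next); // promote the oldest queued toast
  }
}
```

Firing 20 toasts with `maxToasts: 3` leaves 3 active and 17 pending; each dismissal promotes one queued toast.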
NotifyX provides first-class support for AI metadata, rendering Model Context Protocol (MCP) tool calls, token counts, and latency flawlessly.
```js
NotifyX.ai("Processing context...", {
  ai: {
    model: "claude-3-5-sonnet",
    toolName: "read_file",
    latencyMs: 1240,
    tokens: 450,
  },
});
```

It renders elegantly with a custom ✦ icon, specialized color palettes, and a structured metadata bar showcasing agent workflow steps.
LLMs stream responses token-by-token. NotifyX's StreamingBridge ensures smooth, jank-free progressive text rendering.
```js
const stream = NotifyX.stream("Thinking...", {
  ai: { model: "gpt-4o", streaming: true },
  position: "bottom-right",
});

// Stream chunks as they arrive
stream.update("I have found ");
stream.update("the specific bug ");
stream.update("in your code.");

// Finalize
stream.success("Analysis complete!", {
  ai: { streaming: false, latencyMs: 850 },
});
```

A blinking cursor is natively rendered while streaming is active!
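When tokens arrive very rapidly, you may want to coalesce several small chunks into fewer `update()` calls on your side. A minimal, illustrative buffer (the `ChunkBuffer` name and `flushEvery` threshold are hypothetical, not part of the NotifyX API):

```js
// Coalesce tiny token chunks into fewer sink calls (illustrative only).
class ChunkBuffer {
  constructor(sink, flushEvery = 3) {
    this.sink = sink; // e.g. (text) => stream.update(text)
    this.flushEvery = flushEvery;
    this.chunks = [];
  }

  push(chunk) {
    this.chunks.push(chunk);
    if (this.chunks.length >= this.flushEvery) this.flush();
  }

  flush() {
    if (this.chunks.length === 0) return;
    this.sink(this.chunks.join("")); // one combined update per batch
    this.chunks = [];
  }
}
```

Call `flush()` once at the end of the stream so the final partial batch is not lost.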
Manage asynchronous workflows beautifully.
```js
NotifyX.promise(fetch("/api/user/profile"), {
  loading: "Loading profile...",
  success: "Profile loaded!",
  error: "Failed to fetch profile",
});
```

NotifyX v4 ships with 6 distinct, highly polished themes powered by Vanilla CSS Custom Properties (no Tailwind required).
- `auto` (default): Adapts to OS preference
- `light`: Pristine, modern white
- `dark`: Deep, sleek dark mode
- `glass`: Premium frosted glassmorphism
- `minimal`: Flat, pure content focus
- `brutal`: Monochrome, monospace brutalist
Usage:
```js
// Globally
NotifyX.setTheme("glass");

// Per-toast
NotifyX.success("Task complete", { theme: "brutal" });
```

Powered by the Web Animations API for buttery-smooth 60fps rendering, bypassing main-thread blocking.
- `spring` (default): Bouncy, physical, lively
- `slide`: Smooth directional translation
- `bloom`: Elegant scale and fade
- `flip`: 3D spatial rotation
- `fade`: Simple opacity transition
```js
NotifyX.info("System update", { animation: "bloom" });
```

Because NotifyX v4 is built on a modular architecture, the underlying subsystems are exposed directly on the `NotifyX` object for advanced control:
- `NotifyX.queue`: Access the min-heap priority queue (`ToastQueue`) instance. Manage overflow, peek at queued state, or manually flush items.
- `NotifyX.animation`: Access the `AnimationEngine` directly. Hook into the Web Animations API (WAAPI) engine to apply `staggerEnter`, `pulse`, or `shake` animations to your own UI elements.
- `NotifyX.stream_bridge`: Access the `StreamBridge` utility. Use its `fromIterable` and `pipe` helpers to map arbitrary LLM data streams directly into DOM nodes.
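The rough idea behind a `fromIterable`-style helper is to consume any (async) iterable of text chunks and forward the accumulated text to a sink on each step. The sketch below is illustrative only (`fromIterableSketch` is a hypothetical stand-in, not the actual `StreamBridge.fromIterable` implementation):

```js
// Illustrative sketch: drain an (async) iterable of chunks into a sink.
async function fromIterableSketch(iterable, sink) {
  let text = "";
  // for await works for both sync and async iterables,
  // so an LLM SDK's token stream can be passed in directly.
  for await (const chunk of iterable) {
    text += chunk;
    sink(text); // e.g. (t) => toastElement.textContent = t
  }
  return text; // the fully accumulated response
}
```

With an async generator yielding model tokens, each iteration progressively updates the target DOM node until the stream completes.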
```js
NotifyX.configure({
  theme: "glass",
  animation: "spring",
  position: "top-right",
  duration: 4000,
  maxToasts: 3,
  pauseOnHover: true,
});
```