Call for System Demonstrations
ACM Conference on AI and Agentic Systems (ACM CAIS 2026)
San Jose, CA • May 26–29, 2026
Overview
The CAIS 2026 System Demonstrations track invites submissions showcasing innovative AI systems, compound AI architectures, and autonomous agents. We welcome demonstrations of both research prototypes and production systems that advance the state of the art in AI and agentic systems.
CAIS is uniquely positioned at the intersection of AI, systems, and engineering—and our demo track reflects that. We are looking for demos that show real systems doing real things, whether that means a novel inference serving architecture, a multi-agent coordination framework, a production monitoring tool, or an interactive autonomous agent.
Topics of Interest
We invite demonstrations across two broad categories:
Agent Demonstrations
- Autonomous AI agents with formalized goal-driven behavior
- Novel multi-agent systems with emergent coordination and interaction protocols
- Integration of agents with external real-world systems, sensors, or hardware
- Evaluated agent behavior with principled benchmarking or user studies
- Agents with persistent memory, learning, and adaptation over time
- …and more! We welcome other demos that push the boundaries of what AI agents can do!
AI Systems Demonstrations
- Novel inference serving systems with efficiency or scalability advances
- Compound AI system architectures with principled composition
- Data and ML pipeline systems for training, fine-tuning, or curation at scale
- Production AI system monitoring, evaluation, and debugging tools
- Optimization and adaptation of AI systems for deployment constraints
- …and more! We welcome other demos that advance the state of the art in building, deploying, or operating AI systems!
Submission Guidelines
Paper
- 4 pages of content, plus unlimited additional pages for references and appendices
- Accepted papers will receive 1 additional page for the camera-ready version
- Use the standard ACM sigconf format (double-column!)
- Submissions are single-blind (author names are visible to reviewers)
What to Include in Your Paper
Your submission should cover:
- What your system does and why it matters. Describe the problem, your approach, and what makes it different from existing systems or tools.
- Architecture and technical details. Provide enough detail for a knowledgeable reader to understand how the system works, including key design decisions and trade-offs.
- Demonstrating your system's contribution. Show evidence that your system works and is meaningfully better or different from alternatives, using metrics appropriate to what you built. For example:
- Inference system: latency, throughput, cost per query at scale
- Agent framework: task completion rates, number of tool calls, error recovery
- Monitoring tool: detection latency, false positive rate, coverage
- RAG pipeline: retrieval accuracy, end-to-end response quality, indexing speed
- Interactive agent demo: success rate on representative tasks, qualitative walkthroughs
Formal user studies are welcome but not required.
- Demo scenario. Describe what attendees will see and experience at the live demonstration. What is the interaction? What is the most compelling thing to show?
Video
- A demo video is required (3–5 minutes)
- The video should show the system in action end-to-end, not just slides or architecture diagrams
- Upload the video to a hosting platform (e.g., YouTube, Vimeo) and include the link in your submission; a private (unlisted) YouTube link is acceptable and may be included as a footnote in the main paper
Live Demo Artifact
Authors are strongly encouraged to provide a live demo URL, hosted notebook, or installable package that reviewers can interact with. Accessible artifacts help reviewers better evaluate the system and strengthen the submission.
If a live artifact is not feasible (e.g., due to infrastructure requirements or proprietary constraints), authors should explain why in the submission and ensure the demo video clearly demonstrates end-to-end functionality.
How Demos Will Be Reviewed
Demo submissions may emphasize technical depth (novel architecture, significant engineering contribution, production-scale challenges) or audience experience (interactive, compelling live demonstration, clear real-world impact)—or both. Reviewers will evaluate each submission based on its strengths. A technically deep infrastructure demo should not be penalized for being less visually flashy, and a highly interactive agent demo should not be penalized for having a simpler architecture, as long as each excels in its primary dimension.
Authors are encouraged to indicate which dimension their demo primarily demonstrates:
- (a) Novel technical contribution
- (b) Compelling interactive experience
- (c) Both
Review Criteria
- Relevance: Does the system address problems in AI systems, compound AI, or agentic systems?
- Novelty: Does the system introduce new ideas, architectures, or approaches? Has this system been demonstrated at another venue before?
- Technical quality: Is the system well-designed? Are the design decisions and trade-offs clearly explained?
- Evidence of contribution: Does the submission include appropriate metrics or evaluation demonstrating the system works as described?
- Demo quality: Is the demo scenario well-designed? Will it be compelling for a live audience? Is the video clear and informative?
Presentation
Accepted demos will be presented as live demonstrations during poster and demo sessions, with dedicated demo stations. Authors should be prepared to give live demonstrations and answer questions from attendees throughout the session.
For demos that do not translate well to an interactive booth format—for example, systems requiring specialized hardware, large-scale infrastructure, or extended setup—there may be an option to present a 5–10 minute talk during a scheduled demo talk session. Authors who believe their demo is better suited to this format should discuss it with the demo chairs after paper acceptance.
Important Dates
| Milestone | Date |
| --- | --- |
| Demo submission deadline | Fri, March 13, 2026 (11:59 PM AoE) |
| Notification of acceptance | Fri, April 10, 2026 |
| Camera-ready deadline | Mon, April 27, 2026 (11:59 PM AoE) |
Submission Site
Submissions will be handled via a dedicated demo track HotCRP site. The submission link will be available shortly.