NUS partner AI Challenge x student builders

Co-author an AI Challenge track for Acacia AI Symposium 2026.

Bring a real R&D problem to student AI builders for six weeks before the NUS finale in Singapore on 17-18 August 2026.

200+ student teams
3 partner-authored tracks
6 weeks of build signal
500+ Symposium attendees
NUS Singapore finale

Partner problem gap

Hiring filters find model builders. Hard problems reveal engineers.

A track turns one real R&D problem into six weeks of observable build signal. You see teams reason, scope, debug, trade off, and present before any interview begins.

View challenge format
01

Ambiguous problem framing

02

Messy data and constraints

03

Engineering judgment under pressure

04

Communication with real stakeholders

29 Jun - 18 Aug 2026

Challenge architecture

Real problems, real constraints, and a six-week window long enough to expose judgment beyond polished demos.

Co-author a track ->

Example tracks

Challenge briefs should feel like systems work, not homework.

Each track starts from a partner's real pipeline: ambiguous inputs, imperfect data, hard constraints, and a scoring rubric that rewards engineering judgment.

RAG-01 Retrieval systems

Grounded enterprise QA under adversarial conditions

Teams build against contradictory documents, prompt injection, stale sources, and abstention rules across a large corpus.

  • 50k source documents
  • Hallucination penalty
  • Adversarial evaluation

EDGE-02 Edge AI

Real-time anomaly detection on constrained hardware

Teams deploy a compact vision pipeline for manufacturing defects under latency, memory, and domain-shift constraints.

  • Sub-200ms inference
  • 4GB memory ceiling
  • No cloud fallback

SAFE-03 AI safety

Decision support that knows when not to answer

Teams design a system that escalates uncertainty, explains tradeoffs, and keeps operators in the loop for high-risk calls.

  • Traceable decisions
  • Human review paths
  • Failure-mode reporting

Your R&D pipeline

Bring the problem you cannot evaluate from a resume.

We help convert the problem into a public challenge brief, data access model, judging rubric, and finalist mentorship path.

Partner outcomes

The sponsorship product is technical evidence.

The event package matters less than the signal you create. Every partner path should leave behind proof of builders, systems, and judgment.

Start a conversation ->
200+ student teams
6-week challenge
500+ finale attendees
3 partner-authored tracks
01

Convert open problems into recruiting signal

A challenge track shows how candidates scope ambiguity, debug constraints, and communicate tradeoffs under a real brief.

02

See six weeks of work, not a polished pitch

Partners can review submissions, observe finalists, and meet teams after they have already produced technical artifacts.

03

Put your platform inside the build path

Infrastructure, data, model, and tooling partners can make their systems the default environment for high-agency builders.

04

Anchor technical visibility in a hard problem

Talks, demos, panels, and booths point back to the challenge, giving your presence a concrete technical center of gravity.

Event appendix

The wider Symposium stays compact, technical, and useful.

01

Program frame

  • AI Challenge opens 29 June 2026
  • Symposium finale on 17-18 August 2026
  • National University of Singapore
02

Audience

  • AI and engineering students
  • Researchers and faculty
  • Industry leaders and technical operators
03

Symposium themes

  • Intentional AI systems
  • Data scarcity and evaluation
  • Safe deployment and inference
04

Organisers

  • Acacia AI Society
  • NUS SoC AI Society
  • StartIT

Contact

Co-author a track from a real technical problem.

Tell us what signal you want to create. We will help shape the brief, timeline, evaluation plan, and partner involvement around it.

3 Challenge tracks
500+ Symposium attendees
200+ Competing teams
6 Universities

FAQ

Common partner questions.

Ask us directly ->

Who is this for?

AI and engineering leaders deciding whether to co-author a real challenge from their R&D pipeline, sponsor a technical track, or meet builders through evidence of work.

What do track partners provide?

You provide a real problem, dataset shape, constraints, and evaluation goals. We help turn that into a public brief, then you judge submissions and mentor the strongest finalist teams.

Why run a challenge instead of a booth?

A booth conversation shows interest. A six-week challenge shows how teams reason, build, debug, communicate, and recover when constraints are real.

Can we take part without exposing sensitive data or systems?

Yes. Tracks can use synthetic data, public proxies, held-out evaluations, API sandboxes, or scoped access so the core engineering problem is real without exposing sensitive systems.

When should partners get involved?

The AI Challenge opens on 29 June 2026. Track partners should start problem shaping early enough to prepare the brief, data access model, judging rubric, and mentor cadence.

Who organises the event?

The event is organised by Acacia AI Society with NUS SoC AI Society and StartIT at the National University of Singapore.

Are there sponsorship options beyond challenge tracks?

Yes. Partners can support talks, recruiting access, platform credits, workshops, booths, or the finale. The strongest packages still connect back to technical evidence.