Once your AICC files were ready (`.crs`, `.au`, `.des`, and their companions), the next step was to make them talk to an LMS.
This was where AICC came to life.
The specification didn’t just define how to describe a course — it defined how to run it.
🗂️ The Role of the `.au` File
Every AICC course was made up of one or more Assignable Units (AUs) — standalone lessons, simulations, or assessments.
Each AU was described in the `.au` file, which acted like a launch manifest. It defined:
The title of the learning unit
The URL or file path to the launch content
Parameters to pass to the course (like student ID or session ID)
The criteria for completion or passing
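A minimal `.au` file might look like the sketch below. It follows the AICC convention of a quoted, comma-delimited header row naming the fields, followed by one row per AU; the exact column set varies by AGR level, and every value here is invented for illustration:

```
"System_ID","Type","Command_Line","File_Name","Max_Score","Mastery_Score","Max_Time_Allowed","Time_Limit_Action","Web_Launch","AU_Password"
"A1","Lesson","","lesson01/index.htm","100","80","","","",""
```

The `File_Name` column carries the launch URL or relative path, and `Mastery_Score` feeds the pass/fail decision described above.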
When the LMS imported the course, it parsed the `.au` file and built a structured course menu.
Each AU became a clickable item, and when the learner launched one, the LMS would generate a session token and send it to the content.
Behind the scenes, every launch kicked off a HACP communication loop:
The content reported progress, scores, and completion status using plain HTTP messages — the same mechanism you explored last week.
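That loop can be sketched in a few lines. The form fields (`command`, `version`, `session_id`, `aicc_data`) and the INI-style `[Core]` block come from the HACP specification; the session ID, score, and time values below are made up for illustration:

```python
# Sketch of the body of a HACP PutParam request, the message a piece of
# content POSTs back to the LMS to report progress. The field names are
# HACP's own; the concrete values are placeholders.
from urllib.parse import urlencode

def build_putparam_body(session_id: str, status: str, score: int) -> str:
    """Build the form-encoded body of a HACP PutParam request."""
    aicc_data = (
        "[Core]\r\n"
        f"Lesson_Status={status}\r\n"
        f"Score={score}\r\n"
        "Time=00:12:30\r\n"
    )
    return urlencode({
        "command": "PutParam",
        "version": "2.2",
        "session_id": session_id,
        "aicc_data": aicc_data,
    })

body = build_putparam_body("ABC123", "completed", 92)
# The content would POST this body to the AICC_URL supplied at launch.
```

Because the whole exchange is just form-encoded text over HTTP, the same body could be assembled by hand in any language, or even typed into a testing tool.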
🌐 How Launching Worked
Upload and Host:
The AICC package (the flat files and content) was uploaded to a web-accessible directory.
The LMS stored a reference to the `.crs` file or directly to the course URL.
LMS Reads the Specs:
During import, the LMS parsed the `.crs`, `.au`, `.des`, `.cst`, and `.cmp` files.
From this, it built an internal representation of the course structure and sequencing.
Launch the AU:
When a learner clicked “Start,” the LMS opened the launch URL in a new window or frame.
It passed key query parameters, such as the session ID and the return URL for HACP calls. The content then used these values for all communication back to the LMS.
Track the Session:
While the learner progressed, the content periodically sent HACP messages reporting status, score, and time. Once the learner finished, the final call marked the session as complete.
Close and Record:
The LMS confirmed the result, updated the database, and returned a simple “OK.”
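The launch step above can be sketched as follows. `AICC_SID` and `AICC_URL` are the standard HACP launch parameters; the host names and session ID are placeholders, not values from any real system:

```python
# Sketch of the launch URL an LMS builds for an AU. AICC_SID identifies
# the tracking session; AICC_URL tells the content where to POST its
# HACP messages. Both hosts below are made-up examples.
from urllib.parse import urlencode

def build_launch_url(au_url: str, session_id: str, hacp_endpoint: str) -> str:
    """Append the two standard HACP launch parameters to an AU's URL."""
    params = urlencode({"AICC_SID": session_id, "AICC_URL": hacp_endpoint})
    return f"{au_url}?{params}"

url = build_launch_url(
    "https://content.example.com/lesson01/index.htm",
    "ABC123",
    "https://lms.example.com/hacp",
)
```

Everything the content needs for the rest of the session travels in those two query parameters, which is why the model worked equally well across vendors.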
No JavaScript events, no SCORM runtime APIs — just HTTP.
🧪 Testing AICC Content
Testing an AICC course was refreshingly straightforward.
Because everything relied on text files and POST requests, developers could debug with nothing more than:
A web browser
A text editor
Tools like `curl` or HTTP request logs
If the LMS didn’t register a completion, you could open the `.au` file, verify the launch path, and manually replay the HACP request to confirm the data exchange.
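Replaying a request by hand might look like this sketch, which assembles an equivalent `curl` command from the session values; the endpoint and session ID are placeholders:

```python
# One way to replay a HACP call for debugging: generate the curl
# command that sends the same form fields the content would. The
# endpoint URL and session ID here are invented examples.
import shlex

def hacp_curl(endpoint: str, session_id: str, command: str = "GetParam") -> str:
    """Build a curl command that POSTs a minimal HACP request."""
    fields = {"command": command, "version": "2.2", "session_id": session_id}
    parts = ["curl", "-s"]
    for key, value in fields.items():
        parts += ["-d", f"{key}={value}"]  # each -d adds one form field
    parts.append(endpoint)
    return " ".join(shlex.quote(p) for p in parts)

cmd = hacp_curl("https://lms.example.com/hacp", "ABC123")
print(cmd)
```

Running the printed command against the LMS endpoint shows exactly what the server returns, with no runtime API or browser tooling in between.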
It was an era when visibility and simplicity made troubleshooting easier than most modern e-learning stacks allow.
🧱 Why It Worked
AICC’s launch model succeeded not because it was advanced, but because it was consistent.
Every LMS and every content vendor followed the same small set of rules.
If your course obeyed the spec, it would run anywhere — whether hosted on an internal server or distributed on CD-ROM.
That’s the power of clear contracts between systems:
No proprietary dependencies
No hidden runtime behavior
Just transparent, predictable communication
💡 Developer Reflection
AICC proved that integration doesn’t need complexity — it needs consistency.
Its launch logic worked because everyone agreed on how the parts should interact.
When you design systems today, ask yourself:
Would it still function if all you had were text files and an HTTP endpoint?
Sometimes, the strongest architectures are the ones that don’t depend on anything more.
🔭 What’s Next
Next week, we’ll explore what happened when the web took over — and SCORM replaced AICC as the dominant e-learning standard.
You’ll see what SCORM improved, what it broke, and what we can still learn from the transition.
🔢 5 of 8 | AICC – The Origins of E-Learning Standards