Overview

AI Labs provides a structured and scalable model for universities to manage academic lab environments. It uses a three-level hierarchy (Stream → Subject → Batch) to organize academic programs, allocate GPU/CPU resources, and ensure controlled access for students and faculty.

Before this hierarchy is created, the Organization Administrator procures the required Stock Keeping Units (SKUs) through the AI Lab Subscription. These SKUs, defined in the environment template, represent the lab environments available for provisioning (for example, notebook instances, VS Code servers, or high-memory compute nodes).

Note:

  • SKU procurement happens at the Organization level.
  • Procured SKUs flow into Streams → Subjects → Batches and are reused across both models described below (see the sketch that follows).
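
To make the note above concrete, here is a minimal sketch of what an Organization-level SKU pool might look like before it is distributed down the hierarchy. The SKU names and quantities are purely illustrative assumptions; actual SKUs come from the environment template in your AI Lab Subscription, and this dictionary is not an AI Labs API object.

```python
# Hypothetical Organization-level SKU pool (illustrative names and counts only).
# Actual SKUs are defined by the environment template procured with the
# AI Lab Subscription; this is just a plain-Python stand-in.
org_sku_pool = {
    "notebook-gpu-small": 60,    # e.g. single-GPU notebook instances
    "vscode-server-cpu": 30,     # e.g. CPU-only VS Code servers
    "highmem-compute-node": 10,  # e.g. high-memory compute nodes
}

# Total units available to allocate across Streams -> Subjects -> Batches.
total_units = sum(org_sku_pool.values())
print(f"Procured units available for allocation: {total_units}")
```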

Understanding the Hierarchy

AI Labs organizes academic programs into three clear layers:

  1. Stream (First Level): A Stream represents a major academic specialization. Examples: Computer Science, Cyber Security, Data Science. Each stream becomes a dedicated workspace where its subjects will be created.
  2. Subject (Second Level): A Subject represents an individual course or module offered within a stream. Examples: Machine Learning, Deep Learning, Data Analytics. Subjects define the lab administrators and enforce subject-level quotas.
  3. Batch (Third Level): A Batch is a time-bound group of students enrolled in a subject. Examples: ML Morning Batch, DL Evening Batch, Weekend Batch. Batches connect students to the lab resources they can access.

📘 Example: How a Real University Maps to this Hierarchy

University XYZ
 └── Stream: Computer Science
       ├── Subject: Machine Learning
       │     ├── Batch: ML – Morning (9–11am)
       │     └── Batch: ML – Evening (3–5pm)
       └── Subject: Deep Learning
             └── Batch: DL – Weekend Batch

This example shows how academic structures align directly with the Stream → Subject → Batch hierarchy.
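
For readers who prefer code to diagrams, the sketch below models the same structure with plain Python dataclasses. The class and field names (Stream, Subject, Batch, gpu_quota) are illustrative assumptions, not AI Labs API objects.

```python
from dataclasses import dataclass, field

# Illustrative data model only; these classes are not AI Labs API objects.
@dataclass
class Batch:
    name: str                                   # e.g. "ML – Morning (9–11am)"
    students: list[str] = field(default_factory=list)

@dataclass
class Subject:
    name: str                                   # e.g. "Machine Learning"
    gpu_quota: int                              # subject-level quota (assumed field)
    batches: list[Batch] = field(default_factory=list)

@dataclass
class Stream:
    name: str                                   # e.g. "Computer Science"
    subjects: list[Subject] = field(default_factory=list)

# University XYZ from the example above, expressed as data.
computer_science = Stream("Computer Science", subjects=[
    Subject("Machine Learning", gpu_quota=10, batches=[
        Batch("ML – Morning (9–11am)"),
        Batch("ML – Evening (3–5pm)"),
    ]),
    Subject("Deep Learning", gpu_quota=10, batches=[
        Batch("DL – Weekend Batch"),
    ]),
])
```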


Why Two Administration Models?

Different universities follow different operational patterns:

  • Some prefer central IT control, where everything is configured by a single administrator.
  • Others prefer department-level autonomy, where professors manage their own courses.

AI Labs supports both through two administrative models.


🔹 Model 1: Centralized Administration

In this model, the Org Admin performs almost all setup.

  • Org Admin procures SKUs
  • Org Admin creates Streams
  • Org Admin creates Subjects
  • Org Admin creates Batches
  • Lab Admins are assigned only to support students (not to create Subjects or Batches)

Typical Characteristics

  • Simplest administrative model
  • Maximum governance and control
  • Lab Admins function as teachers/helpers only
  • No department-level creation of Subjects or Batches

This model is ideal for universities that want a central IT-managed workflow.

```mermaid
flowchart TB

%% Top Level
subgraph ORG[University Admin]
    UA[University Admin<br>Quota: 100 GPUs]
    OA[Org Admin]
end

UA --> OA
OA --> STCR[Create Stream]

%% Stream Level
subgraph STREAMS[Streams]
    DS[Data Science Stream]
    ML[Machine Learning Stream]
    DL[Deep Learning Stream]
    DE[Data Engineering Stream]
end

STCR --> DS
STCR --> ML
STCR --> DL
STCR --> DE

%% Subject Level
subgraph SUBJECTS[Subjects]
    SUBJCR[Create Subject<br>Quota: 10 GPUs]
    MLSUBJ[ML Subject]
    DLSUBJ[DL Subject]
end

DS --> SUBJCR
SUBJCR --> MLSUBJ
SUBJCR --> DLSUBJ

%% Batch Level
subgraph BATCH[Batch Admin]
    BATCHCR[Create Batches]
    MB[Morning Batch]
    EB[Evening Batch]
end

MLSUBJ --> BATCHCR
BATCHCR --> MB
BATCHCR --> EB

%% Students
subgraph STUDENTS[Students]
    S1[Notebook Student<br>Quota:1 GPU]
    S2[Inference Student<br>Quota:1 GPU]
    S3[VM Student<br>Quota:1 GPU]
end

MB --> S1
EB --> S2
EB --> S3
```
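
The division of duties above can also be read as a simple permission map. The sketch below, using assumed role and action names that are not AI Labs identifiers, captures the key point of Model 1: only the Org Admin creates Streams, Subjects, and Batches, while Lab Admins support students.

```python
# Hypothetical permission map for Model 1 (Centralized Administration).
# Role and action names are assumptions for illustration only.
MODEL_1_PERMISSIONS = {
    "org_admin": {"procure_sku", "create_stream", "create_subject", "create_batch"},
    "lab_admin": {"support_students"},
    "student":   {"use_assigned_lab"},
}

def can(role: str, action: str) -> bool:
    """Return True if the role is allowed to perform the action in Model 1."""
    return action in MODEL_1_PERMISSIONS.get(role, set())

assert can("org_admin", "create_batch")          # Org Admin creates Batches
assert not can("lab_admin", "create_subject")    # Lab Admins do not create Subjects
```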

🔹 Model 2: Delegated Administration

In this model, responsibilities are distributed across multiple roles.

  • Org Admin creates Streams and assigns Stream Admins
  • Stream Admins create Subjects and assign Lab Admins
  • Lab Admins create Batches and add students
  • Students access only the resources linked to their batch

Typical Characteristics

  • Departments manage their own courses
  • Lab Admins actively create batches
  • Org Admin oversees only SKU pool and top-level setup
  • Enables decentralized autonomy for academic departments

This model is suitable for universities that prefer department-level administrative control.

```mermaid
flowchart TB

%% Top Level
subgraph ORG[University Admin]
    UA[University Admin<br>Quota: 100 GPUs]
    OA[Org Admin]
end

UA --> OA
OA --> STCR[Create Stream]

%% Stream Level
subgraph STREAMS[Streams]
    SA[Stream Admin<br>Quota: 40 GPUs]
    DS[Data Science]
    ML[Machine Learning]
    DL[Deep Learning]
    DE[Data Engineering]
end

STCR --> SA
SA --> DS
DS --> ML
DS --> DL
DS --> DE

%% Subject Level
subgraph SUBJECTS[Subjects]
    LA[Lab Assistant<br>Quota: 10 GPUs]
    SUBJ[Create Subject]
end

ML --> LA
LA --> SUBJ

%% Batch Level
subgraph BATCH[Batch Level]
    BA[Batch Admin<br>Quota: 5 GPUs]
    BATCHES[Create Batches]
    MB[Monday Batch]
    TB[Tuesday Batch]
end

SUBJ --> BA
BA --> BATCHES
BATCHES --> MB
BATCHES --> TB

%% Students
subgraph STUDENTS[Students]
    S1[Notebook Student<br>Quota: 1 GPU]
    S2[Inference Student<br>Quota: 1 GPU]
    S3[VM Student<br>Quota: 1 GPU]
end

MB --> S1
TB --> S2
TB --> S3
```
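
The quotas in the diagram above shrink at each level: 100 GPUs at the university, 40 for a Stream Admin, 10 at the subject level, 5 at the batch level, and 1 per student. As a minimal sketch using those example numbers, the check below verifies that allocations at each level stay within the quota granted by the level above; the function and figures are illustrative and not part of AI Labs.

```python
# Illustrative quota-cascade check for Model 2 (Delegated Administration).
# Each level's allocations must fit within the quota granted by its parent.
def fits_within(parent_quota: int, child_quotas: list[int]) -> bool:
    """True if the child quotas sum to no more than the parent quota."""
    return sum(child_quotas) <= parent_quota

university_gpus = 100            # University Admin's pool
stream_admin_quota = 40          # granted by the Org Admin to a Stream Admin
subject_quotas = [10, 10]        # granted by the Stream Admin at the subject level
batch_quotas = [5, 5]            # granted to individual batches
student_quotas = [1, 1, 1]       # one GPU per student in a batch

assert fits_within(university_gpus, [stream_admin_quota])
assert fits_within(stream_admin_quota, subject_quotas)
assert fits_within(subject_quotas[0], batch_quotas)
assert fits_within(batch_quotas[0], student_quotas)
```
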
🔔 Important: Reference Use Case

βœ”οΈ Notebook SKUs are shown only as an example. Customers may choose to provision any SKU types supported by their environment template and organizational requirements.

The SKU Procurement → Stream Creation → Subject Creation → Batch Creation → Student Access flow shown in this documentation represents a sample university use case built using AI Labs.

This reference flow is based on Model 2 (Delegated Administration), where responsibilities are distributed across Org Admins, Stream Admins, and Lab Admins.

While profile names, navigation labels, and SKU types can be customized per customer, the core workflow and hierarchy remain consistent across implementations.