Week 2
Implicit Interactions
History, theory, affordances, calm computing. AprilTags as physical-digital links. Your first smart room.
Slides
Demos
Interaction Spectrum
Explore the range from explicit to implicit
Explicit Control
User initiates every action deliberately
Implicit Response
Room responds to position automatically
Interaction Designer
Build a product, see its quadrant and values
Affordances Explorer
Physical vs computational affordances
Calm Tech Test
Weiser's principles as a checklist
Seven Values
Victor's communal computing principles
AI: Agent vs Material
14 scenarios comparing two visions of AI
Room Simulator
Drag objects around, watch rules trigger
Live Room View
Real-time visualization of objects from the server
Raw Data Stream
JSON output from the camera
Labs
01
Ambient Light
Change room lighting based on tag position
02
Zone Response
Define zones, trigger actions on entry
03
Simple Rules
Build a JSON-driven rule engine
04
WebSocket Listener
Connect to Python server stream
05
AprilTag Detector
Visualize tag detections from a real camera
06
Sensor Simulator
Build custom simulated sensors
Running the server: see the server setup guide for venv instructions
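The rule engine in Lab 03 (with the zones from Lab 02) can be sketched in a few lines. A minimal Python sketch, assuming a hypothetical rule schema where each rule names a tag id, a rectangular zone, and an action string; the actual lab's JSON format may differ:

```python
import json

# Hypothetical rule schema: fire `action` when `tag` sits inside `zone`.
RULES = json.loads("""
[
  {"tag": 7, "zone": {"x": 0,   "y": 0, "w": 200, "h": 200}, "action": "lights_on"},
  {"tag": 7, "zone": {"x": 400, "y": 0, "w": 200, "h": 200}, "action": "lights_off"}
]
""")

def in_zone(pos, zone):
    """True if the (x, y) position falls inside the zone rectangle."""
    return (zone["x"] <= pos[0] < zone["x"] + zone["w"]
            and zone["y"] <= pos[1] < zone["y"] + zone["h"])

def evaluate(rules, detections):
    """detections maps tag id -> (x, y); returns the actions to trigger."""
    return [r["action"] for r in rules
            if r["tag"] in detections and in_zone(detections[r["tag"]], r["zone"])]

print(evaluate(RULES, {7: (50, 120)}))  # tag 7 is inside the first zone
```

Keeping the rules in JSON means the room's behavior can be edited without touching the engine, which is the point of the lab.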
Week 3
Seeing the Room
OAK-D cameras, SSH into Raspberry Pis, person detection with YOLO, building notification pipelines.
Slides
Demos
Person Detection Sim
Simulated YOLO bounding boxes with a confidence slider
SSH Journey
Animated SSH connection explainer
Pipeline Builder
Drag-and-drop DepthAI pipeline assembly
Detection Timeline
Debounce and temporal smoothing visualizer
Discord Flow
Webhook notification flow animation
Cameras vs Tags
Side-by-side comparison of both approaches
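The debounce idea behind the Detection Timeline demo can be sketched directly: raw per-frame YOLO results flicker, so presence should only flip on after several consecutive hits and off after several consecutive misses. A minimal sketch with illustrative thresholds (tune to your camera's frame rate):

```python
class Debouncer:
    """Smooth a noisy per-frame boolean signal (e.g. YOLO 'person detected').

    State flips to True only after `on_frames` consecutive hits,
    and back to False only after `off_frames` consecutive misses.
    """

    def __init__(self, on_frames=3, off_frames=5):
        self.on_frames = on_frames
        self.off_frames = off_frames
        self.state = False
        self._streak = 0  # consecutive frames disagreeing with current state

    def update(self, detected):
        if detected != self.state:
            self._streak += 1
        else:
            self._streak = 0
        threshold = self.on_frames if not self.state else self.off_frames
        if self._streak >= threshold:
            self.state = not self.state
            self._streak = 0
        return self.state
```

A notification pipeline would call `update()` once per frame and only post to Discord when the smoothed state changes, not on every raw detection.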
Week 4
The Invisible Interface
AI as a design material. What models are, how they work, and how to run them on your own machine.
Demos
AI Terminology
Interactive taxonomy: AI, ML, deep learning, LLMs, agents
ML Training
Train a neural net on spatial scenarios — fatigue, attention, room modes
Model Explorer
AI model sizes, RAM requirements, and what runs where
Smart Object Studio
Transform everyday objects into smart objects — affordances, sensors, intelligence
Interaction Patterns
Four ways a smart room responds — presence, context, occupancy, objects
AI as a Design Material
Map model properties to material metaphors — size, temperature, context, specialization
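The training loop behind the ML Training demo fits in a few lines of plain Python. A minimal sketch of logistic regression by gradient descent on a made-up two-sensor occupancy dataset (the demo's actual spatial scenarios and model differ):

```python
import math

# Made-up toy data: predict room-occupied (1) vs empty (0)
# from two illustrative sensor features (motion level, sound level).
DATA = [((0.9, 0.8), 1), ((0.8, 0.6), 1), ((0.7, 0.9), 1),
        ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.0, 0.3), 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=1.0, epochs=200):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y                 # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

w, b = train(DATA)
predict = lambda x1, x2: sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5
print(predict(0.85, 0.7), predict(0.05, 0.1))
```

Seeing the loop this small is the point: "training" is just repeated nudges to a handful of numbers, whatever the model's size.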
Week 6
Conversational Machines
Machine perception and voice prototyping — all browser-based, no hardware needed.
Research Module
Interaction Experiments
A small module for responsive text, adaptive hierarchy, public room voice, and the ethics of how a sensed space speaks back.
Start Here
Interaction Experiments Hub
Week plan, framing notes, and links to the live prototypes
Calm Signifier
One sensed classroom state, three public voices: calm, helpful, and creepy
Adaptive Reading Field
Stable navigation, adaptive hierarchy, Pretext-driven type fields, and deeper-learning support
Lecture: Interaction Experiments
A controller + audience deck explaining Pretext, public signifiers, adaptive hierarchy, and culturally aware interface paradigms
Week 7
Interactive Spaces
Projection mapping, camera-based touch detection, voice commands, and AI — all browser-based. The repo is now organized around a simple starter and Surface Studio.
All Content
Interactive Spaces Hub
Curated hub with the simple starter, Surface Studio, docs, and archived prototypes
Lecture: Interactive Spaces
Homography, BroadcastChannel, MediaPipe hand/face tracking, voice, actions
Lecture: Surface Studio
Full app walkthrough, projection workflow, widgets, actions, and deployment-ready setup
Examples
Reference
Tools
Running slides locally:
cd slides && npm install && npm run dev
Documentation