Cisco Packet Tracer
How AI Cut Student Frustration in Half

Current Workflow
Offer weekly tutorial videos and hold office hours to help students complete their networking labs in Cisco Packet Tracer.
Challenge
Students often got stuck. One small config error or file glitch could break the whole network.
Impact
→ An AI-powered assistant increased students' confidence and revealed a path to reducing staff workload.
01 Overview
While serving as a TA for three semesters in a networking course, I noticed a consistent pattern: students didn’t struggle with the networking concepts themselves — they struggled with the tool, Cisco Packet Tracer. Despite weekly tutorials and regular office hours, many students hit frustrating dead ends, often compounded by their unfamiliarity with both the interface and the foundational concepts of networking. These weren’t just usability issues — they were barriers to learning, costing students valuable time and requiring instructors to provide repeated support. I asked myself:
What if Packet Tracer wasn’t just a simulator, but an AI-augmented learning tool — one that could guide students through mistakes in real time?
Duration
Jan 2024 – May 2024 (concept design developed over 3 weeks in early 2024)
02 Challenge
Cisco Packet Tracer is a robust simulator, but its design creates friction for first-time users and self-learners.
Key Pain Points
Student Drop-Off Risk
Frustration led to disengagement, risking lower completion rates in foundational courses.
High Support Dependence
TAs spent significant time addressing avoidable tool issues.
Brand Experience
A clunky tool experience risks poor perception of Cisco's learning tools.
03 Approach
I focused on reducing the cognitive load of troubleshooting itself — by designing an assistant that understood students’ intent and responded with structured, task-specific help.
I observed where learning broke down
“Why isn’t this port/router working?”
“I don’t know why this is not working.”
“What does this command do?”
“Why can’t this device reach that one?”
I started by observing real behaviors and friction points during lab sessions. I noticed that students didn’t all get stuck in the same way — they encountered different types of obstacles, ranging from conceptual confusion to procedural missteps. Some didn’t understand networking logic, others struggled to interpret the state of a specific device, while many couldn’t trace where their communication flow was breaking down.
I mapped real student questions into intent patterns

Then I flipped the model: let students clarify their problem through interaction
Instead of writing a perfect prompt, students used small, intuitive actions — like clicking on a device or selecting two endpoints — to signal what kind of help they needed. These interaction patterns worked like debug types, guiding the AI to respond with the right kind of support.
General
Ask conceptual or CLI-related questions (e.g., IPv4 vs. IPv6).
• No topology selection
• Natural language question
• Explanation + CLI sample
Select Device/Link
Click a specific device or port for configuration-level help.
• Detect misconfigurations
• Device-specific CLI fixes
• Explain common errors
Select Connection
Choose two devices to trace a failed flow (e.g., ping failure).
• Trace path
• Locate drop point
• Suggest next fix
Select Area
Draw a loop around a section of the topology to scan that zone for logic or IP issues.
• Pattern scan
• Top issue list
• Prioritize student attention
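The four interaction modes above can be sketched as a simple intent dispatcher. This is a minimal illustration of the concept, not Packet Tracer's actual API — the `Selection` shape and the mode names are assumptions made for this sketch:

```python
from dataclasses import dataclass, field

# Hypothetical selection payload: what the student clicked in the topology.
@dataclass
class Selection:
    devices: list = field(default_factory=list)  # selected device names, e.g. ["R1"]
    area: bool = False                           # True if a region was lassoed

def classify_intent(selection: Selection) -> str:
    """Map a student's selection to one of the four debug types."""
    if selection.area:
        return "area_scan"         # scan a zone for logic or IP issues
    if len(selection.devices) == 2:
        return "trace_connection"  # trace a failed flow between two endpoints
    if len(selection.devices) == 1:
        return "device_config"     # configuration-level help for one device
    return "general"               # conceptual or CLI question, no selection

# Examples:
print(classify_intent(Selection()))                       # general
print(classify_intent(Selection(devices=["R1"])))         # device_config
print(classify_intent(Selection(devices=["PC0", "R1"])))  # trace_connection
print(classify_intent(Selection(area=True)))              # area_scan
```

The point of the dispatcher is that the student never types a "mode" — the selection itself carries the intent, so the AI can pick the right kind of response without a carefully worded prompt.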
04 Solution
The AI assistant categorized student issues into task types and offered real-time, intent-driven help without requiring perfect prompts.
Each session generated a concept review log, organizing all AI interactions into a searchable, reusable learning history.
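The concept review log described above could be modeled as a searchable list of interaction records. This is a sketch with assumed field names, not a specification of the actual prototype:

```python
from dataclasses import dataclass

# Hypothetical record of one AI interaction during a lab session.
@dataclass
class LogEntry:
    intent: str    # which debug type handled the question
    question: str  # what the student asked or selected
    concept: str   # the networking concept the AI explained
    fix: str       # the suggested CLI fix or next step

class ConceptReviewLog:
    """Collects one session's AI interactions into a reusable history."""
    def __init__(self):
        self.entries: list[LogEntry] = []

    def record(self, entry: LogEntry) -> None:
        self.entries.append(entry)

    def search(self, keyword: str) -> list[LogEntry]:
        """Find past entries whose concept or question mentions a keyword."""
        kw = keyword.lower()
        return [e for e in self.entries
                if kw in e.concept.lower() or kw in e.question.lower()]

# Example session history:
log = ConceptReviewLog()
log.record(LogEntry("device_config", "Why isn't this port working?",
                    "interface shutdown state", "no shutdown"))
log.record(LogEntry("trace_connection", "Why can't PC0 reach R1?",
                    "default gateway misconfiguration",
                    "ip default-gateway 192.168.1.1"))
print(len(log.search("gateway")))  # 1
```

Structuring the history this way is what makes it reusable: a student reviewing for an exam can search by concept ("gateway", "VLAN") rather than rereading a chat transcript.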
05 Impact
While this was a concept sprint, I tested the prototype in student interviews and received strong validation:
• 8 of 9 students said they would use the assistant instead of coming to office hours for common issues.
• 6 students reported feeling more confident exploring CLI tasks with the AI as a safety net.
• Course staff expressed interest in using it to reduce support load.
06 Reflection
Inspired by how people use AI today — not for open-ended chat, but to get specific tasks done — I designed this assistant to be action-oriented and embedded in context to help users get unstuck and move forward in their actual workflow.
AI should be context-aware and task-specific
Students already turned to ChatGPT when they were stuck. But GPT couldn’t “see” their Packet Tracer lab. It gave generic answers, not tailored support — and students needed to write perfect prompts to get anything useful. Most couldn’t.
From Commands to Intent-Driven Design
Students don't want to memorize CLI commands — they want to express their intent. That insight shifted my design principle: from command-driven interaction to intent-based guidance. The AI became a mentor — walking them through networking as a process, building skill and confidence step by step.