Full System Development & Client Projects
Grade 10 Design reaches the highest level of the MYP design cycle. You are expected to design and build complete, functional systems for real clients, rigorously evaluate every specification criterion with test evidence, and document the full design cycle in a comprehensive ePortfolio. This is the capstone of five years of MYP Design.
What You'll Learn
- Conduct rigorous needs analysis and client consultation for Criterion A
- Develop detailed, measurable design specifications and justify multiple design options (Criterion B)
- Build a complete, functional product with documented changes and client feedback (Criterion C)
- Evaluate every specification criterion against test evidence; propose specific justified improvements (Criterion D)
- Apply advanced software development concepts: SDLC, version control, agile development, accessibility (WCAG)
- Produce a comprehensive ePortfolio demonstrating the complete design cycle
ePortfolio Assessment
Criterion A — Inquire & Analyse (8 marks): Design brief, annotated research, existing solutions analysis, target audience needs.
Criterion B — Develop Ideas (8 marks): Full design specification, multiple design options with justification, chosen design explanation.
Criterion C — Create Solution (8 marks): Product evidence (screenshots/photos/video), annotated development plan, documented changes.
Criterion D — Evaluate (8 marks): Test protocol, test results, evaluation against EVERY specification criterion, specific improvements with justification.
Key Vocabulary
| Term | Definition |
|---|---|
| Design brief | A concise statement of the design problem, client needs, constraints, and context |
| Design specification | Detailed, measurable list of all criteria the final product must meet; the basis for Criterion D evaluation |
| SDLC | Software Development Lifecycle: requirements → design → implementation → testing → deployment → maintenance |
| Version control | System tracking all changes to code (e.g., Git/GitHub); enables rollback and collaboration |
| Agile development | Iterative approach building features in short cycles (sprints) with continuous testing and client feedback |
| Waterfall model | Linear sequential development phases completed in strict order; traditional approach |
| Responsive design | Web/app design that adapts layout and content to different screen sizes and devices |
| WCAG | Web Content Accessibility Guidelines: international standards for making digital content usable by people with disabilities |
| Usability testing | Having real users interact with a prototype to identify problems and evaluate effectiveness |
| Comprehensive evaluation | Evaluating every specification criterion objectively with test evidence — not just "it looks good" |
Criterion A: Inquire & Analyse
Criterion A is the foundation of the design cycle. Strong analysis informs everything that follows. At Grade 10, this means genuine client consultation, rigorous existing solutions analysis, and a clearly justified design need.
What Criterion A Requires
Design Brief
A clear, concise statement of: who the client is, what problem exists, what kind of solution is needed, and what constraints apply (time, budget, technology, audience).
Research
Annotated research that goes beyond listing facts. Identify specific design principles, user needs, and technological possibilities. Each source should be cited and evaluated for reliability.
Existing Solutions Analysis
Analyse at least two, and ideally three, existing solutions. Identify specific strengths and weaknesses of each. Use this analysis to inform your own design specification.
Target Audience
Define your target audience specifically: age range, technical ability, context of use, device preferences. Interview or survey real users if possible.
Design Brief Example
Client: Year 11 students preparing for eAssessments; secondary users: science teachers.
Problem: Students waste significant time finding relevant past questions from multiple scattered sources. There is no filtering by topic, year, or difficulty level.
Solution needed: A web application enabling students to search and filter a database of past exam questions by subject, topic, difficulty, and year.
Constraints: Must be free to host; must work on mobile and desktop; must be navigable without technical training; must be launched within 12 weeks.
Analysing Existing Solutions
- Name and describe each solution specifically
- Identify 2–3 specific strengths (use screenshots and design annotations)
- Identify 2–3 specific weaknesses (relate to your client's needs)
- State what you will take from each solution (inspiration) and what you will do differently (improvement)
Criterion B: Develop Ideas
Criterion B is about generating, developing, and justifying design options. The design specification is the most critical document — it defines what success means for Criterion D. Multiple design options must be presented and justified.
Writing a Design Specification
Every criterion in the specification must be:
- Measurable (can be objectively tested: yes/no, or against a data threshold)
- Specific (not vague: "user-friendly" → "navigable by a novice in under 2 minutes")
- Justified (explain why this criterion matters for the client)
Specification Example (Web Application)
| # | Criterion | Test Method | Justification |
|---|---|---|---|
| 1 | Load time under 3 seconds on standard broadband | Chrome DevTools, 3 measurements | Users abandon pages loading >3s (Google research) |
| 2 | Mobile-responsive on screens 375px to 1920px | Chrome DevTools device simulation; test on 3 physical devices | 65% of users will access via mobile device |
| 3 | Filter by topic returns correct results in 100% of tests | Systematic testing with 20 test queries, checking output | Core functionality; errors undermine trust |
| 4 | Meets WCAG AA accessibility for colour contrast | WAVE accessibility checker; manual check | Inclusive design for visually impaired users |
| 5 | 8/10 new users locate target resource in under 2 minutes (usability) | Usability test with 10 students; timed with think-aloud protocol | Client requirement: navigable without training |
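Criterion 5 works because it states a numeric threshold that data can be checked against. As a minimal sketch of that check (the timings below are invented for illustration, not real test results):

```javascript
// Sketch of evaluating the usability criterion "8/10 new users locate the
// target resource in under 2 minutes". Timings (seconds) are made-up data.
const timings = [78, 95, 110, 115, 65, 180, 102, 88, 130, 74];
const PASS_THRESHOLD_S = 120; // "under 2 minutes"
const REQUIRED_PASSES = 8;    // "8/10 new users"

const passes = timings.filter(t => t < PASS_THRESHOLD_S).length;
const criterionMet = passes >= REQUIRED_PASSES;
console.log(`${passes}/${timings.length} users under 2 minutes -> criterion ${criterionMet ? "met" : "not met"}`);
```

The same pattern (threshold constant, measured data, pass count) applies to any measurable criterion in the table.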
Multiple Design Options
- Each option must be substantially different (not just colour changes)
- Annotate each with strengths and weaknesses relative to the specification
- Justify your chosen design with reference to specific specification criteria
- Include client feedback if possible
Criterion C: Create the Solution
Criterion C assesses the creation process and the quality of the final product. Documentation of the development process, including changes made and their justification, is as important as the product itself.
What Criterion C Requires
Product Evidence
Screenshots, photos, or video clearly showing the complete, functional product. All features described in the specification must be visible.
Annotated Development Plan
Timeline with milestones; shows planned vs actual progress; demonstrates self-management and systematic approach.
Documented Changes
When you deviated from the original design, explain what changed, why, and how it improved the product. Undocumented changes score lower.
Technical Skills
Code quality (if digital product), construction quality (if physical), or composition quality (if media). Annotations demonstrate understanding, not just execution.
Advanced Development Concepts
| Concept | Definition | Why it matters in Grade 10 |
|---|---|---|
| Version control (Git) | Tracks all code changes; enables rollback to previous versions | Shows professional practice; provides evidence of development over time |
| Agile sprints | Build one feature completely (design → code → test) before moving to next | Higher quality individual features; easier to document and evaluate |
| Responsive design | CSS media queries adapt layout to screen size | Essential for modern web products; tested in specification |
| Accessibility | Colour contrast, alt-text, keyboard navigation, screen reader compatibility | Ethical design; WCAG compliance is a specification criterion |
| User testing during development | Test with real users during (not only after) development | Identifies problems earlier; more cost-effective to fix; improves final quality |
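The responsive-design row refers to CSS media queries; a minimal sketch of the idea, where the class name and 600px breakpoint are arbitrary examples rather than recommended values:

```css
/* Two-column question list on wide screens… */
.question-list {
  display: grid;
  grid-template-columns: 1fr 1fr;
  gap: 1rem;
}

/* …collapsing to a single column on narrow (mobile) screens. */
@media (max-width: 600px) {
  .question-list {
    grid-template-columns: 1fr;
  }
}
```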
Criterion D: Evaluate
Criterion D is the most challenging and most commonly under-achieved criterion. At Grade 10, you must evaluate EVERY specification criterion objectively with test evidence, and propose specific improvements with justification. "It looks good" is not an evaluation.
The Evaluation Framework
- State the specification criterion exactly as written
- State the test method you used
- State the test results (data, not impressions)
- Evaluate: Was the criterion met? Fully? Partially? Why/why not?
- Propose a specific improvement, with justification of why it would work
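One way to make sure no criterion is skipped is to capture the framework as data, one record per specification criterion. The records below are condensed from this page's own worked examples; the field names are our own convention, not an official format:

```javascript
// Each record follows the five-step framework: criterion, method, results,
// evaluation (met?), and a justified improvement.
const evaluations = [
  {
    criterion: "Load time under 3 seconds",
    method: "Chrome DevTools, 3 measurements",
    results: "2.4s, 2.7s, 2.5s (mean 2.53s)",
    met: "fully",
    improvement: "Convert 1.2MB hero image to compressed WebP",
  },
  {
    criterion: "Filter returns correct results in 100% of tests",
    method: "20-query test protocol",
    results: "18/20 correct (90%)",
    met: "partially",
    improvement: "Exclusive topic tags to remove keyword overlap",
  },
];

// Print one line per criterion so gaps in the evaluation are easy to spot.
for (const e of evaluations) {
  console.log(`${e.criterion}: ${e.met} met -> ${e.improvement}`);
}
```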
Model Evaluation for One Criterion
Test method: Chrome DevTools Performance tab, measuring total page load time on three separate occasions on a 50Mbps connection.
Results: Load times were 2.4s, 2.7s, and 2.5s (mean: 2.53s).
Evaluation: This criterion was successfully met — all three measurements recorded load times below the 3-second threshold.
Improvement: The hero image (currently 1.2MB) is the single largest contributor to load time. Converting it to WebP format and compressing to under 200KB would reduce mean load time to approximately 1.8s, improving performance significantly — particularly for users on mobile or slower connections.
What Happens at Different Mark Bands
| Mark Band | What is Required |
|---|---|
| 1–2 | Basic description of the product; no testing data |
| 3–4 | Some evaluation against some criteria; limited testing evidence |
| 5–6 | Evaluation against most criteria with some data; improvements suggested but not justified |
| 7–8 | Evaluation of EVERY criterion with objective test data; specific improvements with technical justification; honest assessment including unmet criteria |
Worked Examples
These examples demonstrate the quality of documentation and evaluation expected at Grade 10.
Test method: I created a test protocol of 20 filter queries covering all 10 available topics (2 queries per topic). For each query, I documented the expected results and compared them with actual returned results.
Results: 18/20 queries returned correct results (90%). Two queries for the 'Forces and Motion' topic returned questions from 'Energy and Waves' due to an overlapping keyword in the database tags.
Evaluation: This criterion was partially met. The 90% accuracy is functionally useful but falls short of the 100% specification.
Improvement: The error occurred because both topics shared the keyword 'wave' in their question tags. The fix is to implement exclusive topic tags (removing 'wave' from Forces/Motion entries and creating a more precise tagging taxonomy). This is a data management fix that can be implemented without changing the filter algorithm. Estimated time: 2 hours. This would raise filter accuracy to >99%.
(b) Accessibility: "The website must achieve a minimum WCAG AA compliance rating, specifically: all text meets a minimum contrast ratio of 4.5:1 against its background (verified by WAVE accessibility checker), and all images have descriptive alt text."
(c) Usability: "At least 8 out of 10 first-time users must successfully locate and complete a specified task (e.g., 'find a past question on topic X') within 2 minutes, without any assistance, as measured by a structured usability test with a think-aloud protocol."
Agile: Iterative sprints (1–2 weeks each) where features are designed, built, and tested before moving to the next. Incorporates ongoing client feedback. Changes can be made throughout development. Risk: scope can expand; requires strong self-management.
Waterfall: Linear, sequential phases (requirements → design → implementation → testing → deployment → maintenance) completed in strict order. Provides clear structure and documentation, but changes after a phase is complete are difficult and costly.
For a school design project: Agile is typically more suitable because: requirements often evolve after client meetings; iterative testing produces higher quality; it is more realistic about uncertainty. However, a hybrid approach works well: a brief waterfall planning phase (clear requirements and design specification) followed by agile sprint-based development.
Original plan: Use static HTML pages, one per exam topic, with questions hard-coded into the HTML.
Issue identified: After creating 3 static topic pages (approximately 60 questions each), it became clear that maintaining 10 topics × 5 years of past questions would require updating 50 separate HTML files. Adding a new year's questions would require editing every page. This approach is not scalable and would make future maintenance impractical.
Change made: Switched to a JSON database file storing all questions with metadata tags (topic, year, difficulty). JavaScript fetch and filter functions dynamically generate the question list on a single HTML template page.
Impact: Required 3 additional days of development. However, the new architecture: (a) reduces page maintenance from 50 files to 1 template + 1 data file; (b) makes future data additions trivial; (c) enables the filter function (specification criterion 2). The change was essential to meet multiple specification criteria.
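The JSON-plus-filter architecture described in this change can be sketched as follows. The data shape, tags, and function name are illustrative assumptions, not the student's actual code:

```javascript
// Minimal sketch of a JSON "database" of questions with metadata tags.
const questions = [
  { id: 1, topic: "Forces and Motion", year: 2022, difficulty: "medium" },
  { id: 2, topic: "Energy and Waves",  year: 2023, difficulty: "hard"   },
  { id: 3, topic: "Forces and Motion", year: 2023, difficulty: "easy"   },
];

// Return every question matching ALL supplied filter fields.
function filterQuestions(db, filters) {
  return db.filter(q =>
    Object.entries(filters).every(([key, value]) => q[key] === value)
  );
}

console.log(filterQuestions(questions, { topic: "Forces and Motion", year: 2023 }));
```

In the real app the data would live in a separate JSON file loaded with `fetch`, and the matching questions would be rendered into the single HTML template page.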
Client: Year 7 students (ages 11–12) at [school name]; secondary users: parents/guardians.
Context: Starting secondary school, Year 7 students encounter multiple subjects with independent homework for the first time. Many lack strategies for tracking and prioritising deadlines.
Problem: Current solutions (physical planners, notes) are frequently lost or not updated. Digital solutions (Google Calendar) are designed for adult use and too complex for this age group. Students regularly miss deadlines or feel overwhelmed due to poor visibility of upcoming tasks.
Solution needed: A simple, visually engaging mobile app allowing students to add homework tasks, set deadlines, and receive notifications. Must be intuitive for age 11–12 with no training, and not require a school account to access.
Constraints: Cross-platform (iOS/Android); no user data storage beyond device (privacy); maximum 5 main screens; completed within 10 weeks.
Solution 1: General-purpose to-do app
Strengths: Clean interface; strong cross-device sync; colour-coding by priority; recurring task support.
Weaknesses: Designed for adult productivity; requires account creation; many features irrelevant for student use (project management, filters); onboarding takes 20+ minutes.
Learning: Good colour-coding and priority system worth adopting. Account requirement not appropriate for target age group — my app will use local storage only.
Solution 2: MyHomework Student Planner
Strengths: Specifically designed for students; class/subject colour coding; good calendar view.
Weaknesses: Cluttered UI with too many options for Year 7 users; free version has persistent ads which are distracting; requires school email to set up class links.
Learning: Subject-colour coding is highly effective. I will keep the interface minimal (maximum 3 actions per screen) and avoid requiring any account creation.
Two specific WCAG AA requirements:
1. Colour contrast ratio: Normal text must have a minimum contrast ratio of 4.5:1 against its background. Large text (18pt or larger) requires 3:1. This ensures readability for people with low vision or colour vision deficiency.
2. Alternative text for images: All non-decorative images must have descriptive alt text that can be read by screen readers. This makes visual content accessible to blind or visually impaired users.
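The 4.5:1 and 3:1 thresholds come from the WCAG 2.x contrast-ratio formula, which is based on the relative luminance of the two colours. A small sketch of the calculation (the formula is from the WCAG specification; the helper names are our own):

```javascript
// Convert an 8-bit sRGB channel (0–255) to its linearised value (WCAG 2.x).
function channelToLinear(c8) {
  const c = c8 / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] colour.
function relativeLuminance([r, g, b]) {
  return 0.2126 * channelToLinear(r)
       + 0.7152 * channelToLinear(g)
       + 0.0722 * channelToLinear(b);
}

// Contrast ratio (L_lighter + 0.05) / (L_darker + 0.05), in the range 1–21.
function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // → "21.0"
```

Black on white gives the maximum ratio of 21:1; a tool like WAVE performs this same calculation for every text element on the page.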
Practice Q&A
Attempt each question before reading the model answer. Focus on precision and justification.
Agile: Iterative development in short sprints (1–4 weeks), each delivering a working feature. Continuous testing and client feedback incorporated throughout. More flexible and responsive to change but requires strong self-management and ongoing client availability.
1. Evaluation against each specific specification criterion (if there are 8, all 8 must be addressed)
2. Test evidence (data, not impressions): load times, user test results, accessibility checker outputs
3. Honest assessment: what was met, what was partially met, what was not met and why
4. Specific improvements with technical justification
"Looks good" and "teacher liked it" provide no measurable evidence and cannot be used to evaluate whether any specification criterion was met.
Problems: "user-friendly" and "easy" are subjective; no test method; no threshold for success; no specific audience defined.
Measurable version: "At least 9 out of 10 Year 7 students with no prior exposure to the app must successfully complete 3 specified tasks (add a homework task, view tomorrow's deadline, mark a task complete) within 3 minutes each, without any assistance, as measured by a structured usability test."
1. Problem-solving ability: When original plans don't work, how did you identify and resolve the issue?
2. Responsiveness to evidence: Changes based on user feedback or testing show a client-centred approach
3. Technical decision-making: Justifying a technical change shows understanding, not just execution
4. Authentic process: Real design is iterative; pretending the first design was perfect is less credible
Undocumented changes cannot earn credit. Even if the change improved the product, without documentation, the examiner has no evidence that the student made an informed, deliberate decision.
It is a specification requirement because: over 65% of global web traffic now comes from mobile devices; a non-responsive site on mobile is unusable (text too small, buttons too small to tap, horizontal scrolling required); it is considered a baseline of professional web design quality; and many clients explicitly require mobile support. Testing it (using browser device simulation and physical device testing) is straightforward and objective.
1. Informs design specification: Real strengths and weaknesses of existing products reveal what your design must do better, helping justify each specification criterion
2. Prevents reinventing the wheel: Good elements of existing designs can be adapted or improved rather than created from scratch
3. Identifies gaps: What existing solutions do NOT do well is where your design adds value to the client
4. Demonstrates research depth: Thorough analysis shows genuine engagement with the design context, not just superficial browsing
For 7–8 marks in Criterion A, analyses must be specific (name features, annotate screenshots) and connected to your own design decisions, not generic.
1. History and rollback: Every change is recorded with a timestamp and message. If a new feature breaks the product, you can revert to a previous working version in seconds
2. Collaboration: Multiple developers can work on different features simultaneously (branches) and merge changes without overwriting each other's work
3. Documentation: Commit messages create a readable history of what changed and why, invaluable for debugging and handover
4. Backup: Code hosted on platforms like GitHub is safe from local hardware failure
5. Accountability: Every change is attributed to a specific person, which matters in professional and academic integrity contexts
In a Design ePortfolio, using Git and showing commit history is evidence of sophisticated development practice that supports Criterion C mark bands.
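The history and rollback benefits above can be seen in a minimal command sequence. The repository and file names are hypothetical, chosen to match the homework-app example:

```shell
# Create a repository for the project and make a first commit.
git init homework-app
cd homework-app
git config user.email "student@example.com"  # local identity for this example
git config user.name "Student"

echo "<!doctype html>" > index.html
git add index.html
git commit -m "Add initial page skeleton"

# Readable history: one commit so far. If a later commit broke the product,
# `git revert <commit>` would undo it while keeping the history intact.
git log --oneline
```

Screenshots of `git log` output like this are the commit-history evidence referred to above.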
Test method: Post-task survey administered to 10 first-time users after completing 3 specified usability tasks. Question: 'How would you rate the clarity of the interface?' (Scale: Very unclear / Unclear / Neutral / Clear / Very clear.)
Results: 7/10 users (70%) rated the interface as 'clear' or 'very clear'. Two users rated it 'neutral' and one rated it 'unclear'.
Evaluation: This criterion was not met. The 70% result falls significantly short of the 95% target. This indicates the interface has clarity problems for approximately 30% of the target audience.
Analysis of failure: The users who rated 'neutral' or 'unclear' commented in the think-aloud protocol that the navigation menu icons were unclear — they did not understand that the house icon meant 'home' or that the bell icon meant 'reminders'.
Improvement: Adding text labels below all navigation icons (e.g., 'Home', 'Reminders', 'Add Task') would eliminate the ambiguity. Research shows icon-only navigation increases error rate by 30% for unfamiliar apps; adding labels reduces this to near zero. This change would require 1 hour of CSS modification and would likely bring the clarity rating above 95%.