
Final Year Design Project (FYDP) Portal
Department of Software Engineering

Dr. Natalia Chaudhry
Coordinator, FYDP
fydp.dse@pucit.edu.pk

FYDP-DSE Milestones & Flow

FYDP-DSE Process Flow

FYDP-DSE process flow overview

Guidelines & Templates

Group Registration
  • Students are welcome to form interdisciplinary groups, including members from other departments (such as the Department of Computer Science, the Department of Information Technology, and the Department of Data Science). Students are expected to follow the Department of Software Engineering’s rules, evaluation criteria, and submission deadlines throughout the project.
  • Cross-campus grouping is permitted within the university network, provided the evaluations and milestone submissions are aligned with the Department of Software Engineering’s calendar.
  • Each group may consist of at most 4 students.
  • Group composition must be finalized and submitted by the deadline via the FYDP-DSE Group Registration Form (shared by FYDP-DSE coordinator via email).
  • Any queries related to FYDP-DSE group formation, or to any other FYDP-DSE activity, must be sent via email to the FYDP-DSE coordinator (fydp.dse@pucit.edu.pk).
  • All group members share equal responsibility for project progress, submissions, and evaluation preparedness.
  • Disputes or member withdrawal must be immediately reported to the FYDP-DSE office.
  • Deadline to register groups via the Google Form is 28th July, 2025.
Group Registration Guidelines
Group Registration form link
Supervisor Registration
  • Students may choose to work with a supervisor from any department, such as the Department of Software Engineering, Computer Science, Information Technology, or Data Science. They can also register a supervisor from industry, based on the nature and needs of their project.
  • Students are expected to follow the Department of Software Engineering’s rules, evaluation criteria, and submission deadlines throughout the project.
  • Students are required to fill out the Supervisor Registration Form (provided by the FYDP-DSE office via email) before the deadline, i.e. 15 Aug, 2025.
  • Late submissions will not be accepted.
Supervisor Registration Guidelines
Supervisor Registration Form
Proposal submission
  • Each FYDP group is required to prepare a comprehensive and realistic project proposal using the prescribed proposal template (FYDP-DSE_005_Proposal template.pdf). The proposal should be developed under the close supervision of the assigned advisor(s) and must reflect the group’s understanding of the problem, its commercial relevance, proposed solution, development roadmap, and final deliverables, including deployment plans and product documentation (e.g., user manual, installation guide, and technical specs).
  • All proposals must be formally submitted and uploaded via the Google Form link shared by the FYDP-DSE Coordinator via email by 28th Aug, 2025.
  • Future evaluation will be based on problem statement clarity, innovation, feasibility, technical scope, and potential for productization.
Proposal submission Guidelines
Proposal submission link (yet to open)
Proposal design template
Proposal evaluation
  • Evaluation of FYDP students will be carried out on the basis of the criteria specified in the proposal evaluation form (Form4_Proposal evaluation form.pdf).
  • A dedicated Faculty Advisory Committee (FAC) will be assigned to each group to conduct the proposal evaluation.
  • It is the responsibility of each group to coordinate with the assigned FAC—details of which will be shared by the FYDP Coordinator via email—for proposal evaluation.
  • The following parameters will be applied in assessing submitted proposals, as reflected in the evaluation form (Form4_Proposal evaluation form.pdf):
    • Clarity of Problem Statement – Evaluation of how clearly the proposal defines the problem, its context, and its significance, ensuring that the objectives are well-articulated and easily understood.
    • Innovation and Novelty – Assessment of the originality of the proposed approach, including creative concepts, unique methodologies, or groundbreaking ideas that distinguish it from existing solutions.
    • Feasibility (Time and Resource Considerations) – Examination of the practicality of executing the proposal within the specified timeframe and with the available resources, including personnel, funding, and infrastructure.
    • Scope – Review of the breadth and depth of the technical aspects addressed, the robustness of the methodology, and alignment with current technological standards, as well as potential for scalability.
    • Potential for Productization – Consideration of the likelihood that the proposed solution can be transformed into a viable product or service, with attention to market applicability, commercialization prospects, and long-term sustainability.
  • The Faculty Advisory Committee will review and evaluate each proposal based on technical feasibility, innovation, market relevance, and completeness using form (Form4_Proposal evaluation form.pdf). The Committee may:
    • Accept the proposal as-is
    • Reject the proposal
    • Accept the proposal with mandatory revisions
  • In case of suggested revisions, groups must resubmit the updated proposal for final approval. Once a proposal is formally accepted, no major changes will be allowed without written justification and approval from the Committee.
  • This proposal process ensures that all FYDPs are well-planned, commercially viable, and technically sound before development begins.
  • The weightage of proposal evaluation in the final cumulative score is 10%.
  • Deadline to submit the proposal evaluation results via the form (Form4_Proposal evaluation form.pdf) is 5th September, 2025.
Proposal evaluation Guidelines
Proposal evaluation form
D1 submission and evaluation
  • All groups must prepare D1 (under the guidance of respective supervisor(s)) using the official templates provided by the FYDP-DSE office (FYDP-DSE_007_A_D1_template.pdf) and (FYDP-DSE_007_W_D1_template.pdf) for Agile and Waterfall methodology, respectively.
  • Groups shall get their D1 evaluated internally by their supervisor(s).
  • The following parameters will be used for evaluating projects following the Agile (Scrum) methodology. Each criterion reflects key aspects of Agile planning, organization, and delivery quality.
    • Clarity of Product Vision: Assesses how clearly the team has articulated the overall product goal, its purpose, and the value it delivers to end users. The vision should be concise, inspiring, and provide a guiding direction for all development activities.
    • Defined Epics: Evaluates whether large, high-level features or initiatives (Epics) are well-identified, logically grouped, and aligned with the product vision. Epics should provide a structured roadmap for subsequent story breakdowns.
    • Well-Structured Product Backlog (Stories): Reviews the organization, clarity, and completeness of the Product Backlog. User stories should be well-defined, follow a consistent format (e.g., “As a… I want… so that…”), and capture all necessary requirements for implementation.
    • Use of Priority Tags / Numeric Prioritization: Checks whether backlog items are assigned clear priorities using tags (e.g., High/Medium/Low) or numeric values (e.g., MoSCoW, Fibonacci sequence) to ensure the most valuable features are delivered first.
    • Definition of Done (DoD): Evaluates the clarity and completeness of the agreed criteria that determine when a backlog item is considered complete. A strong DoD should include quality standards, testing requirements, and documentation expectations to ensure consistent delivery.
    • Git Setup & Deployment Compliance: Evaluates the readiness of the version control environment. The repository should be properly structured, with initial commits reflecting organized setup (folders, boilerplate code, documentation) that supports collaborative development.
  • The following parameters will be used for evaluating projects following the Waterfall development methodology. Each criterion corresponds to a key phase or deliverable in the sequential project execution process.
    • Problem Statement: Evaluates the clarity and completeness of the problem definition, including its context, scope, and significance. The problem should be well-articulated to form a solid foundation for the requirements phase.
    • Functional & Non-Functional Requirements: Assesses whether the requirements are complete, unambiguous, and well-documented. Functional requirements should describe system behaviors, while non-functional requirements should address performance, security, usability, and other quality attributes.
    • Technology Stack Justification: Reviews the rationale behind selecting specific programming languages, frameworks, databases, and tools. The justification should align with the project’s requirements, constraints, and scalability considerations.
    • Lightweight Design Diagrams (DFD/UML) or UI/UX Considerations: Checks whether the design phase outputs are clear, relevant, and easy to interpret. This may include Data Flow Diagrams, UML diagrams, or UI/UX mockups to illustrate the system’s architecture, workflow, and user interface concepts.
    • Git Repository Setup & Initial Commits: Evaluates the readiness of the version control environment. The repository should be properly structured, with initial commits reflecting organized setup (folders, boilerplate code, documentation) that supports collaborative development.
  • Form5_A_D1.pdf and Form5_W_D1.pdf will be used for evaluation. After evaluation, supervisor(s) will submit the completed evaluation form to the FYDP-DSE office.
  • All projects must maintain a properly structured Git repository, with a designated main branch and appropriate feature branches for development. The project supervisor(s) must be added as an administrator or collaborator with full access. All subsequent code, documentation, and deployments must be committed and pushed to this repository throughout the project lifecycle. Compliance with proper Git setup, usage, and deployment will be evaluated as part of the D1 assessment.
  • Each group must submit both a hard copy (signed by the supervisor) and a soft copy (uploaded to the Google Form shared via email by the FYDP-DSE Coordinator) before the deadline, 10th Oct, 2025.
  • The weightage of D1 in the final cumulative score is 30%.
  • Late submissions will be penalized or rejected unless prior approval is obtained.
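The Git compliance requirement above (designated main branch, feature branches, supervisor access, organized initial commits) can be sketched as a minimal repository setup. The repository name, folder layout, branch name, and identity below are illustrative assumptions, not mandated by the FYDP-DSE office:

```shell
cd "$(mktemp -d)"                         # scratch directory for this sketch
mkdir fydp-project && cd fydp-project
git init -b main                          # designated main branch
git config user.name "FYDP Group"         # placeholder identity for the sketch
git config user.email "group@example.com"
mkdir -p src docs
echo "# FYDP Project" > README.md         # boilerplate documentation
touch src/.gitkeep docs/.gitkeep          # keep empty folders tracked
git add .
git commit -m "chore: initial project structure"
git checkout -b feature/initial-module    # a feature branch for development work
# Supervisor access (administrator/collaborator) is granted in the hosting
# platform's settings, e.g. on GitHub: Settings > Collaborators.
```

Granting the supervisor full access cannot be done from the command line alone; it is configured on the hosting platform, so the comment above only notes where that step lives.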
Submission and evaluation Guidelines
D1 submission link (yet to open)
D1 Design templates (Agile and Waterfall)
Evaluation forms (Agile and Waterfall)
D2 submission and evaluation
  • All groups must prepare D2 (under the guidance of respective supervisor(s)) using the official templates provided by the FYDP-DSE office (FYDP-DSE_009_A_D2_template.pdf) and (FYDP-DSE_009_W_D2_template.pdf) for Agile and Waterfall methodology, respectively.
  • Teams must maintain feature branches, proper commit history, and complete README.md documentation.
  • Screenshots or exports should be uploaded as part of the submission for verification.
  • Failure to follow repository structure and documentation requirements may lead to deduction of marks.
  • Each group must submit both a hard copy (signed by the supervisor) and a soft copy (uploaded to the Google Form shared via email by the FYDP-DSE Coordinator) before the deadline, 19th Dec, 2025.
  • The weightage of D2 in the final cumulative score is 60%.
  • Late submissions will be penalized or rejected unless prior approval is obtained.
  • Form6_A_D2.pdf and Form6_W_D2.pdf will be used by supervisor(s) for evaluation.
  • The following parameters will be used for evaluating projects during the D2 stage for teams following the Agile (Scrum) methodology. These criteria assess the team’s delivery, adherence to Agile practices, and quality of output at this stage.
    • Summary Clarity: Evaluates the completeness and clarity of the sprint summary, including key activities, completed work, pending tasks, and any challenges faced. The summary should give a clear picture of progress and outcomes for the sprint(s) under review.
    • Completed Stories with Acceptance Criteria: Assesses whether the committed user stories have been fully implemented and meet the predefined acceptance criteria. Stories should be demonstrably complete and aligned with stakeholder expectations.
    • Definition of Done (DoD) Verification: Reviews whether the completed work adheres to the team’s agreed Definition of Done, including coding standards, testing coverage, documentation, and any other quality measures.
    • Working Build / Demo Readiness: Checks if the current product increment is deployable and ready for demonstration without major defects or blockers. The build should run smoothly in the intended environment.
    • Final GitHub Code Quality: Examines the organization, readability, and maintainability of the code in the GitHub repository. This includes proper commit messages, folder structure, use of branches, and absence of unnecessary or redundant files.
  • The following parameters will be used for evaluating projects during the D2 stage for teams following the Waterfall development methodology. These criteria assess the team’s delivery of the initial functional build, documentation, and code readiness at this stage.
    • Functional Prototype (Core Module): Evaluates whether the core functionality of the system is implemented and operational as per the design specifications. The prototype should demonstrate key workflows and serve as a foundation for further development.
    • README File Quality: Reviews the clarity, completeness, and usefulness of the README file in the GitHub repository. It should provide setup instructions, system requirements, usage guidelines, and any relevant project notes for new contributors or evaluators.
    • GitHub Repository Setup: Assesses whether the repository is well-structured, includes proper branching, and contains relevant files. The repository should reflect ongoing development progress and be accessible to the supervisor with full permissions.
    • Initial CI/CD Plan Mentioned (Even if Manual): Checks whether the team has outlined an initial Continuous Integration/Continuous Deployment plan, even if implementation is manual at this stage. This includes describing how builds will be tested, packaged, and deployed.
    • Code Quality & Organization: Examines the structure, readability, and maintainability of the source code. This includes proper naming conventions, modularity, commenting, and removal of unused or redundant files.
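Since README.md completeness is assessed at D2, a starter skeleton may help. The section names below are suggestions only; the official FYDP-DSE templates take precedence wherever they prescribe structure:

```shell
cd "$(mktemp -d)"                 # scratch directory for this sketch
# Write a starter README.md (section names are illustrative suggestions):
cat > README.md <<'EOF'
# Project Name

## Overview
One-paragraph description of the problem and the solution.

## System Requirements
Runtime versions, SDKs, and third-party services needed.

## Setup
Step-by-step installation and run commands.

## Usage
Key workflows, with example commands or screenshots.

## Branching & Contribution
main-branch policy and feature-branch naming convention.

## Team & Supervisor
Group members, roles, and supervisor contact.
EOF
```

A README structured along these lines also covers the setup instructions, system requirements, and usage guidelines that the Waterfall D2 "README File Quality" criterion calls for.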
Submission and evaluation Guidelines
D2 submission link (yet to open)
D2 Design templates (Agile and Waterfall)
Evaluation forms (Agile and Waterfall)
D3 submission and evaluation
  • All groups must prepare D3 (under the guidance of respective supervisor(s)) using the official templates provided by the FYDP-DSE office (FYDP-DSE_011_A_D3_template.pdf) and (FYDP-DSE_011_W_D3_template.pdf) for Agile and Waterfall methodology, respectively.
  • Complete the final coding, testing, and integration.
  • Prepare the deployment manual or scripts and update the README with final usage instructions.
  • Ensure the GitHub repository is clean and properly structured.
  • Each group must submit both a hard copy (signed by the supervisor) and a soft copy (uploaded to the department portal/email) before the deadline (to be notified soon).
  • Late submissions will be penalized or rejected unless prior approval is obtained.
  • Form7_A_D3.pdf and Form7_W_D3.pdf will be used by supervisor(s) and FAC for evaluation.
  • The following parameters will be used for evaluating projects during the D3 stage for teams following the Agile (Scrum) methodology. These criteria focus on the completeness of deliverables, final implementation quality, and readiness for deployment or demonstration.
    • Sprint Summary Report: Evaluates the completeness and clarity of the final sprint report, covering the work completed in the last iteration(s), unresolved tasks, retrospective insights, and overall project progress.
    • Completed Stories & Acceptance Criteria: Assesses whether all committed user stories for the release have been implemented and verified against their predefined acceptance criteria. Stories should be production-ready and meet stakeholder requirements.
    • Final Definition of Done (DoD) Verification: Reviews whether the final product increment fully complies with the agreed Definition of Done, including quality standards, testing, documentation, and deployment readiness.
    • GitHub Repository Quality: Examines the final state of the GitHub repository in terms of structure, commit history, branching strategy, documentation, and accessibility. The repository should be clean, organized, and reflective of best practices.
    • Working Demo (Live / Video / Screenshare): Assesses the functionality and usability of the working product as demonstrated through a live session, recorded video, or screenshare. The demo should showcase key features without critical errors or downtime.
    • Deployment Documentation: Evaluates the completeness and clarity of the documentation describing how to deploy and run the product. This should include prerequisites, installation steps, configuration instructions, and any environment-specific details.
    • Executable or Packaging Outcome: Reviews whether the project deliverable is available in a packaged format (e.g., installer, Docker image, zipped distribution) or as an executable that can be deployed and tested without additional setup complexities.
  • The following parameters will be used for evaluating projects during the D3 stage for teams following the Waterfall development methodology. These criteria focus on the completeness of the final deliverables, deployment readiness, and end-user support documentation.
    • Final Source Code Link (GitHub): Evaluates whether the complete and final version of the source code is available in the designated GitHub repository. The repository should reflect the finalized state of the project, be well-organized, and accessible to the supervisor.
    • README with Usage Instructions: Reviews the clarity and thoroughness of the README file in providing instructions for installing, running, and using the system. It should include prerequisites, setup commands, and basic troubleshooting tips.
    • Setup Manual (Cross-Platform): Assesses the availability and completeness of a setup guide that works across intended platforms (e.g., Windows, macOS, Linux, Android, iOS). This manual should help new users configure and run the application successfully.
    • Deployment Steps (Localhost / Server / Mobile): Evaluates the documented steps for deploying the system in various environments such as local machines, servers, or mobile devices. The steps should be sequential, tested, and easy to follow.
    • User / Admin Manual: Reviews the completeness and clarity of the manuals intended for end-users and administrators. The manuals should explain key system features, operational workflows, permissions, and administrative controls in an easy-to-understand format.
  • The weightage of D3 in the final cumulative score is 30%.
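The "Executable or Packaging Outcome" criterion above can be met with something as simple as a versioned archive. The sketch below bundles stand-in build output and deployment documentation into a tarball; all file and archive names are illustrative assumptions, not prescribed deliverable names:

```shell
cd "$(mktemp -d)"                                # scratch directory for this sketch
mkdir -p dist/docs
echo "build output placeholder" > dist/app.bin   # stands in for the real build artifact
echo "# Deployment guide" > dist/docs/DEPLOY.md  # prerequisites, install and config steps
tar -czf fydp-app-v1.0.tar.gz dist               # versioned, zipped distribution
tar -tzf fydp-app-v1.0.tar.gz                    # list contents to verify the packaging
```

Teams packaging for other targets (installer, Docker image, APK) would substitute the appropriate build tool here; the point is that the deliverable is a single, versioned artifact that deploys without extra setup.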
Submission and evaluation Guidelines
D3 submission link (yet to open)
D3 Design templates (Agile and Waterfall)
Evaluation forms (Agile and Waterfall)
D4 submission and evaluation
  • All groups must prepare D4 (under the guidance of respective supervisor(s)) using the official templates provided by the FYDP-DSE office (FYDP-DSE_012_A_D4_template.pdf) and (FYDP-DSE_012_W_D4_template.pdf) for Agile and Waterfall methodology, respectively.
  • Teams must ensure the project is fully deployed and accessible for demonstration.
  • The final deliverables must be complete, clean, and tested on a fresh environment.
  • Complete documentation (installation, user manual, and configuration details) must be submitted along with the deployed product.
  • Each group must submit both a hard copy (signed by the supervisor) and a soft copy (uploaded to the department portal/email) before the deadline (to be notified soon).
  • Late submissions will be penalized or rejected unless prior approval is obtained.
  • Form11_A_D4.pdf and Form11_W_D4.pdf will be used by supervisor(s) and FAC for evaluation.
  • The following parameters will be used for evaluating projects during the D4 stage for teams following the Agile (Scrum) methodology. These criteria focus on the final deployment readiness, code quality, demonstration, and comprehensive documentation.
    • Deployment Setup Completeness: Evaluates whether the final deployment is fully operational in the intended environment(s) with all configurations, dependencies, and integrations in place. The setup should allow stakeholders to run the system without additional troubleshooting.
    • Codebase Quality & Cleanliness: Assesses the maintainability, readability, and organization of the final codebase. This includes consistent coding conventions, removal of unused code or files, appropriate commenting, and logical folder structures.
    • Demo Video: Reviews the quality and completeness of the recorded demonstration video showcasing the final product. The video should cover all major features, workflows, and unique aspects of the solution in a clear and concise manner.
    • Documentation: Evaluates the completeness and clarity of all final documentation, including README files, deployment instructions, user guides, API documentation (if applicable), and any additional project-related materials necessary for long-term use or maintenance.
  • The following parameters will be used for evaluating projects during the D4 stage for teams following the Waterfall development methodology. These criteria focus on the completeness of final deliverables, deployment readiness, and user support documentation.
    • Final Source Code Quality & GitHub Structure: Evaluates the maintainability, readability, and organization of the final source code. The GitHub repository should follow a logical structure, use consistent naming conventions, contain relevant commit history, and exclude unnecessary files.
    • README with Usage Instructions: Reviews the clarity and completeness of the README file, ensuring it provides installation steps, system requirements, execution instructions, and troubleshooting tips for end-users.
    • Setup Manual & Platform Deployment (Cross-Platform): Assesses whether the setup manual supports installation and deployment across all intended platforms (e.g., Windows, macOS, Linux, mobile). The instructions should be tested, accurate, and easy to follow.
    • Executable Build / Packaging or Deployment Artifact: Checks the availability and usability of the final product in a packaged or executable form (e.g., installer, Docker image, APK, zipped distribution). The deliverable should be ready for immediate deployment without additional modifications.
    • User / Admin Manual: Evaluates the completeness and usability of manuals for both end-users and system administrators. These manuals should cover feature explanations, workflows, permissions, and system management in a clear, accessible format.
  • The weightage of D4 in the final cumulative score is 50%.
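Since D4 requires deliverables to be tested on a fresh environment, a simple pre-submission check can catch missing files before evaluation. The script below is a hypothetical sketch: the file names are illustrative, and the official templates and forms define the actual required set:

```shell
cd "$(mktemp -d)"                 # fresh directory, as if on a clean machine
touch README.md                   # stands in for one actual deliverable
missing=0
# Check each expected deliverable; names here are illustrative placeholders:
for f in README.md docs/install-guide.md docs/user-manual.md; do
  [ -e "$f" ] || { echo "MISSING: $f"; missing=1; }
done
echo "missing=$missing"           # non-zero means the package is incomplete
```

Running such a check against a clean clone of the repository (rather than a developer's working copy) is what surfaces undocumented setup steps before the evaluators find them.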
Submission and evaluation Guidelines
D4 submission link (yet to open)
D4 Design templates (Agile and Waterfall)
Evaluation forms (Agile and Waterfall)
Final evaluation
  • The Final Evaluation will take place only after successful D4 submission and deployment verification.
  • To ensure that FYDPs meet academic rigor and industry standards, external evaluators—including industry professionals and academic experts—will also be involved in the review and final evaluation process during the later stages of the project. Their input will help assess the innovation, applicability, and market-readiness of the product, ensuring that the outcomes align with current industry expectations.
  • Final documentation should be prepared using the prescribed templates (FYDP-DSE_015_Template_A_v1.pdf, FYDP-DSE_015_Template_W_v1.pdf).
  • The final evaluation of the FYDP will be conducted at the end of the academic year or semester, serving as a comprehensive assessment of the students' work. The evaluation panel will consist of one external evaluator—from industry or academia—approved by the SE FYDP Committee, along with FAC teams, and supervisor(s).
  • Evaluation will be carried out using predefined rubrics (Form13_Final Assessment.pdf). During the final evaluation, students are required to present a fully functional project demo, a project poster, a technical presentation, and a comprehensive project report, reflecting the technical depth, commercial relevance, and deployment readiness of their solution.
  • The following parameters will be used for evaluating projects during the final assessment stage. These criteria focus on the overall technical quality, completeness, professionalism, and readiness of the solution for real-world application.
    • Technical Depth & Correctness: Assesses the complexity, accuracy, and appropriateness of the technical approach used in the project. This includes correctness of algorithms, adherence to technical standards, and depth of subject matter knowledge.
    • Software Design & Architecture: Evaluates the robustness, scalability, and clarity of the system’s design. This includes architecture diagrams, modular structure, separation of concerns, and adherence to design principles.
    • Implementation Quality: Reviews the quality of the codebase, including maintainability, readability, adherence to coding standards, and effective use of chosen technologies.
    • Verification & Validation (Testing): Examines the thoroughness of testing activities, including unit testing, integration testing, and system testing. Also considers bug fixing, test documentation, and evidence of quality assurance processes.
    • Deployment & Reliability: Evaluates whether the system has been successfully deployed in the intended environment(s) and demonstrates stability, error handling, and performance under expected usage conditions.
    • Innovation & Market Readiness: Assesses the uniqueness of the solution and its readiness for commercialization or real-world adoption. This includes competitive advantage, potential user base, and alignment with market needs.
    • Documentation & Presentation: Reviews the clarity, completeness, and professionalism of all project documentation, as well as the effectiveness of the final presentation in communicating the project’s goals, process, and outcomes.
    • Professional Practice & Ethics: Considers whether the team followed ethical guidelines, maintained professional conduct, respected intellectual property, and addressed issues like data privacy, accessibility, and inclusivity.
    • Project Management Evidence: Evaluates the team’s project management practices, including timelines, task allocation, progress tracking, and adaptation to challenges. Evidence may include sprint logs, Gantt charts, meeting notes, or version control history.
  • The final evaluation carries the following weightage: D3 – 30%, D4 – 50%, and External Evaluator Score – 20%.
  • Deadline will be communicated soon.

Submission and evaluation Guidelines
Final Documentation Design templates (Agile and Waterfall)
Evaluation form (Agile and Waterfall)