Checkpoint 2: Software Development Life Cycle (SDLC) Analysis

(35% of Project Grade)

Building on your project selection and initial analysis from Checkpoint 1, this checkpoint focuses on examining your chosen open-source project’s development processes. By analyzing its Software Development Life Cycle (SDLC) practices, you will gain a deeper understanding of how the project transitions from initial ideas to released and maintained software. This checkpoint will help you recognize the project’s engineering maturity, quality control measures, and collaboration patterns—essential insights for contributing effectively to open-source communities.

In this checkpoint, you will write a concise, well-organized report (approximately 2–3 pages, excluding images) that presents your inferences about the project’s SDLC practices. Back up each inference with specific evidence, including direct links to repository items and/or screenshots of relevant discussions, processes, or artifacts.

Key Focus Areas:

  1. SDLC Model or Practices Inferred

  2. Justification with Specific Evidence (Links/Screenshots)

  3. Analysis of SDLC Strengths and Weaknesses

  4. Reflection and Summary Linking to Software Engineering Principles

  5. Implications for Your Future Contributions


Deliverables

  1. Infer the SDLC Model or Practices: Determine which SDLC approach your project most closely resembles (e.g., Agile-like iteration, Waterfall, or a hybrid), even if not explicitly stated. Some key questions to consider:

    • How are requirements or feature requests initiated (issue trackers, mailing lists, forums)?

    • Is there a clear design and planning phase (design documents, architecture diagrams, PR discussions)?

    • What implementation and coding standards exist (style guides, code reviews, linters, automated checks)?

    • How are testing and quality assurance handled (unit/integration tests, CI pipelines, bug trackers)?

    • What does release management look like (versioning strategy, release notes, release frequency)?

    • How are maintenance and support organized (bug triage, security patches, LTS branches)?

  2. Justify Your SDLC Inference: Provide concrete evidence to support your claims; one way to gather such evidence programmatically is sketched after this list. Evidence requirements include:

    • Direct links to issues, pull requests, commits, or documentation.

    • Screenshots of relevant interactions, code review comments, or release notes.

  3. Analyze Strengths and Weaknesses: Critically assess how well the inferred SDLC model supports the project’s quality, collaboration, and sustainability. Some key areas to consider:

    • Agility & Responsiveness: Speed of adapting to new requests/bugs.

    • Collaboration & Communication: Effectiveness of community interaction.

    • Quality & Reliability: Thoroughness of testing and reviews.

    • Scalability & Sustainability: Long-term viability and growth potential.

  4. Reflection and Summary: Relate your findings to the fundamental layers of software engineering (process, methods, tools, quality focus). For example, if the project heavily relies on CI tools, discuss how this ties into the “tools” layer and fosters quality assurance.

  5. Implications for Potential Contributions: Highlight key takeaways for future involvement. Some key questions to consider:

    • Which processes must you follow to propose new features or fixes?

    • Are there established testing/coding protocols to adhere to?

    • How might you effectively communicate with maintainers and contributors?
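
If your chosen project is hosted on GitHub, a short script can help you collect some of the evidence discussed above, such as the release cadence and the presence of CI workflows. The sketch below is only illustrative and is not part of the required deliverables: it assumes the public GitHub REST API, uses a placeholder repository name, and sends unauthenticated (rate-limited) requests. Your report must still cite the underlying artifacts (tags, workflow files, release notes) via links or screenshots.

```python
# Illustrative sketch: query the public GitHub REST API for release history
# and CI workflow files, two useful pieces of SDLC evidence.
# "octocat/hello-world" is a placeholder; substitute your project's owner/repo.
# Unauthenticated requests are rate-limited, so run sparingly.
import requests

REPO = "octocat/hello-world"  # placeholder: replace with your project
API = f"https://api.github.com/repos/{REPO}"

# Release cadence: tag names and publish dates hint at the versioning
# strategy and how often the project ships a release.
resp = requests.get(f"{API}/releases", params={"per_page": 20}, timeout=30)
resp.raise_for_status()
for rel in resp.json():
    print(rel["tag_name"], rel["published_at"])

# Automation: workflow files under .github/workflows suggest automated
# testing, linting, or release pipelines (GitHub Actions).
resp = requests.get(f"{API}/actions/workflows", timeout=30)
resp.raise_for_status()
for wf in resp.json().get("workflows", []):
    print(wf["name"], wf["path"])
```

A similar query against the issues or pull request endpoints can show how quickly new requests are triaged and merged, which feeds directly into the agility and responsiveness part of your analysis.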


Grading Rubric

  1. SDLC Analysis (60 pts)

    • Excellent (54–60 points): Compelling, well-supported SDLC inference with robust evidence (links/screenshots). Thorough analysis of strengths/weaknesses. Strong grasp of SDLC.

    • Good (40–53 points): Reasonable SDLC inference with sufficient evidence. Adequate strengths/weaknesses discussion. Moderate depth of SDLC understanding.

    • Fair (20–39 points): Basic inference with limited or unclear evidence. Shallow strengths/weaknesses discussion. Superficial SDLC grasp.

    • Poor (0–19 points): Lacks coherent SDLC analysis. Little/no evidence; minimal strengths/weaknesses discussion. Poor or no SDLC grasp.

  2. Reflection & Summary (25 pts)

    • Excellent (21–25 points): Thoughtful reflection linking SDLC practices to software engineering principles. Clear implications for contribution.

    • Good (16–20 points): Adequate reflection. Some link to principles. Mentions implications but lacks depth/clarity.

    • Fair (8–15 points): Superficial reflection. Unclear link to software engineering concepts. Minimal implications for contributions.

    • Poor (0–7 points): No meaningful reflection. Fails to connect to broader concepts or discuss contribution implications.

  3. Clarity & Presentation (15 pts)

    • Excellent (13–15 points): Exceptionally clear, well-organized, error-free writing. Professional tone. Links/screenshots labeled effectively.

    • Good (9–12 points): Well-organized with minor errors. Effective use of headings/formatting. Links/screenshots used fairly well.

    • Fair (5–8 points): Understandable, but contains errors or organizational issues. Links/screenshots may be unclear or poorly labeled.

    • Poor (0–4 points): Disorganized, frequent errors, minimal or missing screenshots/links. Difficult to follow.

  Total: 100 points