FSDSS-281 Apr 2026
### 🧩 Investigation

- **Hypothesis:** Date-parsing library throws on out-of-range values.
- **Evidence:** `date-fns` `parseISO` throws `RangeError` (see log line 1123).
- **Next step:** Add explicit validation before calling `parseISO`.
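The "next step" above can be sketched as a guard that runs before the parse call. This is a minimal sketch, not the ticket's actual fix: the helper names are hypothetical, and a native `Date` round-trip stands in for `date-fns`' `parseISO` so the snippet is self-contained.

```javascript
// Hypothetical guard for the "validate before calling parseISO" next step.
// A native Date round-trip stands in for date-fns' parseISO here so the
// snippet has no external dependency; the real code would call parseISO.
const ISO_DATE = /^\d{4}-\d{2}-\d{2}$/;

function isValidIsoDate(s) {
  if (!ISO_DATE.test(s)) return false;          // wrong shape entirely
  const d = new Date(`${s}T00:00:00Z`);
  if (Number.isNaN(d.getTime())) return false;  // engine rejected the value
  // Round-trip check catches out-of-range values (month 13, Feb 30, ...)
  // that a lenient parser might silently roll over to the next month.
  return d.toISOString().slice(0, 10) === s;
}

function safeParseIso(s) {
  if (!isValidIsoDate(s)) return null;  // caller decides how to report bad input
  return new Date(`${s}T00:00:00Z`);    // stand-in for parseISO(s)
}
```

With a guard like this in front of the parser, out-of-range input never reaches the throwing path; callers get `null` back and can surface a proper validation error instead.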
### 🎯 Acceptance Criteria

- [ ] Criterion 1
- [ ] Criterion 2
- [ ] ...
Even if you're not sure what the exact scope of FSDSS-281 is, this template will help you gather the right information, move the work forward efficiently, and keep the whole team in the loop.

1️⃣ Understand the Ticket

| Action | Why it matters | How to do it |
|--------|----------------|--------------|
| Read the ticket title & description | Gives you the initial problem statement or feature request. | Open the ticket in your tracking system (Jira, Azure Boards, GitHub Issues, etc.). |
| Check the "Issue Type" | Determines whether it's a bug, improvement, story, spike, or chore. | Look at the field that categorises the ticket. |
| Look for attached documents | Specs, mock-ups, logs, screenshots, or design docs often hide crucial details. | Expand any "Attachments" or "Links" sections. |
| Identify the stakeholder | Knowing who raised the ticket helps you ask the right clarifying questions. | Check the "Reporter", "Assignee", "Watchers", and any "Requested By" fields. |
| Read the comments thread | Past discussion may already contain work-arounds, decisions, or blockers. | Scan chronologically; watch for "✅ Done" or "❓ Open question". |
| Check related tickets | Dependencies or duplicates affect priority & scope. | Look at "Epic Link", "Parent", "Blocks/Is Blocked By", "Duplicate Of". |

**Tip:** If any of the above is missing (e.g., no description or unclear acceptance criteria), add a short comment asking for clarification before you start digging.

2️⃣ Gather Context & Environment Details

| Item | What to capture | Where to find it |
|------|-----------------|------------------|
| Affected component/module | Name of service, library, UI page, API, etc. | Ticket labels, component field, or code search. |
| Environment(s) | Dev, Staging, Production; OS, browser, device, version numbers. | Ticket, logs, or ask the reporter. |
| Reproduction steps | Exact actions that trigger the issue (including data). | Test manually; record steps in a markdown checklist. |
| Error messages / stack traces | Full text, line numbers, correlation IDs. | Console logs, server logs, monitoring tools (Sentry, Datadog). |
| Feature flag / config status | Whether a flag is on/off that could affect behaviour. | Config repo, LaunchDarkly console, environment variables. |
| Recent changes | Commits, releases, migrations, DB schema changes that happened just before the problem surfaced. | Git history (`git log -p`), release notes, `git bisect` start point. |
| Performance metrics | Latency spikes, memory usage, CPU, DB query times. | APM dashboards, CloudWatch, New Relic. |
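For the "Recent changes" row, `git bisect` can pin down the exact commit that introduced the problem. A hedged sketch, not a prescribed workflow: the tag `v1.4.0` and the script `./run-repro.sh` are hypothetical placeholders; the script should exit non-zero when the bug reproduces and zero when it does not.

```shell
# Find the first bad commit between a known-good release and HEAD.
# v1.4.0 and ./run-repro.sh are hypothetical placeholders for this ticket.
git bisect start
git bisect bad HEAD               # the bug is present here
git bisect good v1.4.0            # last release known to be unaffected
git bisect run ./run-repro.sh     # git tests each midpoint automatically
git bisect reset                  # return to the original branch when done
```

Automating the check with `git bisect run` keeps the search to O(log n) test runs and removes manual bookkeeping; the commit git reports is the natural starting point for the "Recent changes" investigation.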
### 📝 Summary

*(One-sentence description of the problem / feature)*