Implement the computeEffectiveP internal helper and workflowCost public
function, capturing the structural reality that upstream failures multiply
downstream damage, per ADR-004 and the Python research model.
- computeEffectiveP: computes pEffective from intrinsic probability + upstream
propagation using inherited quality factors (parentP + (1-parentP) × qualityRetention)
- workflowCost: processes tasks in topological order, computes per-task EV with
degraded effective probability, includes pIntrinsic/pEffective split
- Supports independent and dag-propagate modes
- Completed tasks propagate p=1.0 downstream but are excluded from results when includeCompleted: false
- Per-edge qualityRetention overrides defaultQualityRetention option
- Throws CircularDependencyError for cyclic graphs via topologicalOrder
- 30+ new tests covering chain compounding, diamond graph, mode comparison,
completed task semantics, cycle detection, per-edge qualityRetention
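The compounding step can be sketched as follows. The combination rule for multiple parents (multiplying their inherited factors) and the `Upstream` shape are assumptions for illustration; only the per-parent formula comes from the commit:

```typescript
// Hypothetical input shape for illustration; not the library's actual API.
interface Upstream {
  parentP: number          // parent's effective success probability
  qualityRetention: number // fraction of quality retained if the parent fails
}

function computeEffectiveP(pIntrinsic: number, upstream: Upstream[]): number {
  // Each parent contributes parentP + (1 - parentP) * qualityRetention:
  // full quality when it succeeds, a retained fraction when it fails.
  // Assumption: factors from multiple parents multiply.
  const inherited = upstream.reduce(
    (acc, { parentP, qualityRetention }) =>
      acc * (parentP + (1 - parentP) * qualityRetention),
    1,
  )
  return pIntrinsic * inherited
}
```

With no upstream tasks the effective probability equals the intrinsic one; each degraded parent pulls it down multiplicatively, which is what makes chains compound.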
Implement the core EV calculation: EV = p*C_success + (1-p)*C_fail
where C_success = scopeCost*impactWeight, C_fail = scopeCost*impactWeight + fallbackCost + timeLost*expectedRetries.
- expectedRetries = (1-p)/p when p>0, else 0 (geometric series)
- Caps expectedRetries at config.retries when retries > 0
- Multiplies final EV by config.valueRate when non-zero
- 30 unit tests covering formula, edge cases, and Python research model values
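A minimal sketch of the EV formula as described above, assuming a flat EvConfig object with these field names (the real config shape may differ):

```typescript
// Field names beyond those named in the formula are assumptions.
interface EvConfig {
  scopeCost: number
  impactWeight: number
  fallbackCost: number
  timeLost: number
  retries: number   // 0 = uncapped
  valueRate: number // 0 = no scaling
}

function expectedValue(p: number, cfg: EvConfig): number {
  // Mean failures before an eventual success (geometric series).
  let expectedRetries = p > 0 ? (1 - p) / p : 0
  if (cfg.retries > 0) expectedRetries = Math.min(expectedRetries, cfg.retries)

  const cSuccess = cfg.scopeCost * cfg.impactWeight
  const cFail = cSuccess + cfg.fallbackCost + cfg.timeLost * expectedRetries

  // EV = p*C_success + (1-p)*C_fail, optionally scaled by valueRate.
  let ev = p * cSuccess + (1 - p) * cFail
  if (cfg.valueRate !== 0) ev *= cfg.valueRate
  return ev
}
```

At p=1 the failure branch vanishes and EV reduces to C_success; as p drops, both the failure branch weight and expectedRetries grow, so cost rises on two fronts.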
Implements graph attribute schemas and the SerializedGraph generic factory
parameterized with <N, E, G> following the graphology JSON format.
- TaskGraphNodeAttributes: name + optional categorical enums (scope, risk,
impact, level, priority, status) — analysis-relevant metadata only
- TaskGraphNodeAttributesUpdate: Type.Partial(TaskGraphNodeAttributes)
- TaskGraphEdgeAttributes: optional qualityRetention number
- SerializedGraph<N, E, G>: generic factory for graphology JSON format
- TaskGraphSerialized: concrete instantiation with empty graph attributes
- No schema version field per spec
35 new tests covering validation, rejection, and compile-time type safety.
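The generic factory pattern might look roughly like this, assuming graphology's standard serialized shape (attributes/options/nodes/edges); the exact option fields and the example attribute schemas are illustrative, not the library's definitions:

```typescript
import { Type, type TSchema } from '@sinclair/typebox'

// Generic factory mirroring graphology's JSON export format,
// parameterized over node (N), edge (E), and graph (G) attribute schemas.
const SerializedGraph = <N extends TSchema, E extends TSchema, G extends TSchema>(
  node: N,
  edge: E,
  graph: G,
) =>
  Type.Object({
    attributes: graph,
    options: Type.Object({
      type: Type.String(),
      multi: Type.Boolean(),
      allowSelfLoops: Type.Boolean(),
    }),
    nodes: Type.Array(
      Type.Object({ key: Type.String(), attributes: Type.Optional(node) }),
    ),
    edges: Type.Array(
      Type.Object({
        key: Type.Optional(Type.String()),
        source: Type.String(),
        target: Type.String(),
        attributes: Type.Optional(edge),
      }),
    ),
  })

// Concrete instantiation with empty graph attributes, in the spirit of
// TaskGraphSerialized (attribute schemas here are placeholders).
const Example = SerializedGraph(
  Type.Object({ name: Type.String() }),
  Type.Object({ qualityRetention: Type.Optional(Type.Number()) }),
  Type.Object({}),
)
```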
- Add TaskInput schema with all fields per architecture (id, name, dependsOn,
categorical fields as Optional(Nullable(...)), metadata fields)
- Add DependencyEdge schema with from, to, qualityRetention fields
- Re-export Nullable helper from task.ts for convenience
- Add type aliases: TaskInput, DependencyEdge via Static<typeof>
- Add 49 tests covering validation, nullable fields, edge cases, type correctness
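The Nullable helper and the Optional(Nullable(...)) layering can be sketched as follows; the enum literals and abbreviated field list are placeholders, and only the overall pattern is from the commit:

```typescript
import { Type, type TSchema, type Static } from '@sinclair/typebox'

// Common TypeBox pattern for nullable fields: a union with null.
const Nullable = <T extends TSchema>(schema: T) => Type.Union([schema, Type.Null()])

const TaskInput = Type.Object({
  id: Type.String(),
  name: Type.String(),
  dependsOn: Type.Optional(Type.Array(Type.String())),
  // Optional AND nullable: the key may be absent, or explicitly null in YAML.
  // The literal values here are illustrative, not the real enum set.
  scope: Type.Optional(Nullable(Type.Union([Type.Literal('small'), Type.Literal('large')]))),
})

type TaskInput = Static<typeof TaskInput>
```

Optional controls whether the key may be omitted; Nullable controls whether an explicit `null` value validates, which matters for YAML frontmatter where authors can set a field to null deliberately.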
Break the @alkdev/taskgraph architecture specs into dependency-ordered
implementation tasks across 8 component directories (setup, schema,
error, graph, analysis, cost-benefit, frontmatter, api) plus review.
Each task has clear acceptance criteria referencing specific architecture
docs. Three review tasks serve as quality gates at critical junction
points (schemas-and-errors, graph-complete, complete-library). The
dependency graph is validated acyclic with 9 topological levels enabling
significant parallelism across independent work streams.
Critical fixes:
- Rename qualityDegradation → qualityRetention across all docs
(semantically inverted: 0.9 meant 90% quality RETAINED, not 90%
degradation). Updated schemas, graph-model, cost-benefit, ADRs.
- Add TaskInput → TaskGraphNodeAttributes transformation section
to graph-model.md, documenting how Nullable(Optional) input fields
map to Optional graph attributes
- Fix DuplicateEdgeError fields: source/target → prerequisite/dependent
to match the established edge direction convention
- Fix resolveDefaults signature: Partial<TaskGraphNodeAttributes>
→ Partial<...> & Pick<TaskGraphNodeAttributes, 'name'> to
require the name field
- Move Nullable helper definition before its first use in schemas.md
- Fix 'construction never throws' contradiction: rephrase to
'construction enforces uniqueness, not data quality'
- Define all 6 enum value sets in schemas.md (previously only
TaskScope and TaskRisk were explicit)
- Add EvConfig parameter table with defaults and semantics
- Document WorkflowCostOptions.limit parameter
- Add construction error handling table to graph-model.md
- Add graph.raw mutation safety warning to api-surface.md
- Update build-distribution.md error class list to include
DuplicateNodeError and DuplicateEdgeError
Explores the diff-based approach (TypeBox Value.Diff → graphology mutation
mapping) as an alternative to rebuild-on-change. Key findings:
- The diff must happen at the graph level, not the source level, because
TaskInput.dependsOn doesn't directly map to edge mutations
- graphology's import(merge=true) handles merges but not deletions
- The real win is reactivity (fine-grained event notifications), not performance
- For <200 node graphs, rebuild is always sub-millisecond
- A hybrid approach (diff for attribute-only changes, rebuild for structural
changes) is possible but adds significant complexity
Decision: defer to v2. ADR-002 (rebuild) stands. The exploration is preserved
for future reference.
- schemas.md: Replace interface ResolvedTaskAttributes with TypeBox schema
+ Static<typeof> derivation (was the only raw interface in the doc set)
- schemas.md: Add explicit TypeBox naming convention table and pattern guide
- schemas.md: Use Nullable() helper for TaskInput optional categorical fields
that can be explicitly set to null in YAML
- schemas.md: Reference typebox-patterns.md research for full analysis
- api-surface.md: Add note about Static<typeof> pattern consistency
- errors-validation.md: Use Value.Errors() for structured validation instead
of bare Value.Check()
- New: docs/research/typebox-patterns.md — comprehensive TypeBox pattern
evaluation covering Static, Values, Convert, Pointer, TemplateLiteral,
generics, defaults, and concrete schema recommendations
Two additions based on learnings from taskgraph architecture decomposition:
- Gather Requirements now explicitly mentions reading downstream consumer
architecture to understand constraints, while noting that consumer dispatch
details belong in the consumer's own docs, not the library's
- Anti-patterns adds 'Consumer dispatch in library docs' — describe what
consumers need, not how they dispatch it
The 751-line architecture.md violated the SDD process modular documentation
target (~500 lines). It also had duplicate TaskGraph class definitions (one
monolith, one decomposed) that directly contradicted each other, and embedded
consumer-specific tool dispatch mappings that belong in downstream projects.
Changes:
- Split into 8 focused documents + 7 ADR records + redirect page
- Removed the monolithic TaskGraph class (kept only decomposed version)
- Moved CLI→plugin dispatch mapping out (belongs in plugin architecture)
- Extracted implementation code (frontmatter splitter, findCycles, DAG
propagation) into WHAT/WHY descriptions per architect role spec
- Added proper ADR format for all resolved design decisions
- Fixed review issues: C_fail mapping, DuplicateNodeError/DuplicateEdgeError
types, ValidationError/GraphValidationError definitions, mutation error
handling contract, enum naming convention, validation timing clarification
Replace outdated worktree_make/worktree_mode/worktree_overview/worktree_cleanup
API references with the new worktree({action, args}) dispatch pattern from
open-coordinator. Add worktree notify tool for implementation/code-review/POC
agents to communicate back to coordinator. Fix stale alkhub_ts paths. Add
open-memory plugin awareness (memory, memory_compact) to AGENTS.md and agent
specs. Update sdd_process.md coordinator section accordingly.
Replace interfaces with TypeBox equivalents for return types.
Resolve 6 open design questions:
1. Rebuild over incremental (graph sizes too small to matter)
2. Strict internal-only subgraph (matches graphology-operators)
3. Throw CircularDependencyError on cycles (both consumers treat as bugs)
4. Always propagate through completed nodes, exclude from output only
5. Defer depth-escalation to v2 (multiplicative compounding already captures it)
6. Adopt source→target edge keys from the start
Add section decomposing TaskGraph into thin data class + standalone
analysis functions, with operations pattern at the consumer layer.
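The decomposition might look roughly like this; class and function names are assumptions for illustration, not the library's actual API:

```typescript
// Illustrative shapes only.
interface Task { id: string; p: number }

// Thin data class: owns structure, performs no analysis itself.
class TaskGraph {
  constructor(
    readonly tasks: Task[],
    readonly edges: [string, string][], // [source, target] pairs
  ) {}
}

// Standalone analysis function operating on the data class, so analyses
// can be added without growing the class surface.
function leafTasks(graph: TaskGraph): Task[] {
  const hasOutgoing = new Set(graph.edges.map(([source]) => source))
  return graph.tasks.filter((t) => !hasOutgoing.has(t.id))
}
```

Operations that mutate or orchestrate (the operations pattern) would then live at the consumer layer, composing these pure functions over the data class.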