The Most Important Refactoring Was Deleting 500 Lines I Was Proud Of


How I deleted a generic Pipeline[T] framework and a pass-through workflow layer, and why the codebase got better by having less code.

The hardest refactoring isn't adding something new. It's deleting something you built.

I built a generic Pipeline[T] framework for the output rendering layer of a Go CLI. It had a fluent API with NewPipeline().Then().Then().Error().Execute(). It had logging decorators. It had error recovery. It handled step sequencing with generics.

It was 500 lines of clever Go.

I deleted all of it and replaced it with 3 sequential function calls.

The Pipeline Framework

The output layer needed to: enrich evaluation results with remediation data, marshal the enriched data to JSON, and write it to stdout. Three steps. Sequential. No branching.

Here's what I built:

// BEFORE: Generic pipeline framework
type Pipeline[T any] struct {
    steps []func(T) (T, error)
    errFn func(error) error
}

func NewPipeline[T any]() *Pipeline[T] { ... }
func (p *Pipeline[T]) Then(fn func(T) (T, error)) *Pipeline[T] { ... }
func (p *Pipeline[T]) Error(fn func(error) error) *Pipeline[T] { ... }
func (p *Pipeline[T]) Execute(input T) (T, error) { ... }

// Usage
result, err := pipeline.NewPipeline[EvalResult]().
    Then(enrich).
    Then(marshal).
    Then(write).
    Error(decorateError).
    Execute(rawResult)

It looked elegant. It was reusable. It handled errors uniformly. It was a framework.

It was wrong.

Why It Was Wrong

1. Three steps don't need a framework

// AFTER: Three function calls
enriched, err := enrich(rawResult)
if err != nil {
    return decorateError(err)
}
marshaled, err := marshal(enriched)
if err != nil {
    return decorateError(err)
}
return write(marshaled)

Nine plain lines. No generics. No framework. No learning curve. A junior developer reads this and understands it in seconds.
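For concreteness, here is a minimal, runnable sketch of the sequential version. The EvalResult fields, enrich logic, and error message are invented stand-ins for the real Stave code, which isn't shown in the article:

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
)

// Hypothetical stand-in for the real evaluation result type.
type EvalResult struct {
	Control     string `json:"control"`
	Remediation string `json:"remediation,omitempty"`
}

// enrich attaches remediation data (placeholder logic).
func enrich(r EvalResult) (EvalResult, error) {
	r.Remediation = "see runbook for " + r.Control
	return r, nil
}

// marshal turns the enriched result into JSON bytes.
func marshal(r EvalResult) ([]byte, error) {
	return json.Marshal(r)
}

// decorateError adds output-layer context (message text is assumed).
func decorateError(err error) error {
	return fmt.Errorf("render output: %w", err)
}

// render is the entire "pipeline": three sequential calls,
// each error handled right at its call site.
func render(w io.Writer, raw EvalResult) error {
	enriched, err := enrich(raw)
	if err != nil {
		return decorateError(err)
	}
	b, err := marshal(enriched)
	if err != nil {
		return decorateError(err)
	}
	_, err = w.Write(b)
	return err
}

func main() {
	_ = render(os.Stdout, EvalResult{Control: "C-42"})
}
```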

2. The framework hid the error handling

In the pipeline version, decorateError was registered as a callback via .Error(). When debugging a marshaling failure, you had to trace through the pipeline's Execute method to understand when and how the error decorator was applied.

In the sequential version, decorateError(err) is right there at the call site. You can see it.

3. The framework prevented simple changes

When we needed to add a validation step between enrich and marshal — only for JSON format — the pipeline couldn't express it cleanly. We'd need conditional steps, or a pipeline builder that accepted format-dependent configurations.

In the sequential version:

enriched, err := enrich(rawResult)
if err != nil {
    return decorateError(err)
}
if format.IsJSON() {
    if err := validate(enriched); err != nil {
        return decorateError(err)
    }
}
marshaled, err := marshal(enriched)

An if statement. No framework changes needed.

The Pass-Through Layer

The same pattern appeared at a different level. An app/workflow package existed solely to forward calls:

// BEFORE: Pass-through package
package workflow

func RunEvaluation(deps Deps, cfg Config) (Result, error) {
    return domain.Evaluate(deps.Controls, deps.Snapshots, cfg.MaxUnsafe)
}

Every method in workflow called exactly one domain function with the same arguments. It added no logic, no validation, no transformation. It existed because "the architecture diagram has an app layer between cmd and domain."

// AFTER: Callers invoke domain directly
result, err := domain.Evaluate(controls, snapshots, maxUnsafe)

The entire package was deleted. One fewer layer to navigate, one fewer package to understand, one fewer indirection to trace.

How to Spot Premature Abstractions

The "what does this add?" test

For every abstraction layer, ask: "If I inline this, does the caller get simpler or more complex?"

  • Pipeline → inlining made callers simpler (3 sequential calls vs fluent chain)
  • Workflow → inlining made callers simpler (direct domain call vs wrapper)
  • Adapter → inlining would make callers more complex (CLI concerns would leak into domain)

The third one stays. The first two go.
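The surviving adapter isn't shown in the article; a hypothetical sketch (all names here are invented) of why that kind of layer passes the inlining test:

```go
package main

import (
	"fmt"
	"strconv"
)

// Domain side (stand-in for the real domain package):
// knows nothing about flags or exit codes.
type Config struct{ MaxUnsafe int }

func Evaluate(cfg Config) (string, error) {
	if cfg.MaxUnsafe < 0 {
		return "", fmt.Errorf("maxUnsafe must be >= 0, got %d", cfg.MaxUnsafe)
	}
	return fmt.Sprintf("ok (maxUnsafe=%d)", cfg.MaxUnsafe), nil
}

// Adapter side: translates CLI concerns (flag strings, exit codes)
// into domain types and back. Inlining this would leak flag parsing
// and exit codes into the domain, making callers more complex --
// which is exactly why this layer stays.
func runFromFlags(maxUnsafeFlag string) (output string, exitCode int) {
	n, err := strconv.Atoi(maxUnsafeFlag)
	if err != nil {
		return "invalid --max-unsafe: " + maxUnsafeFlag, 2
	}
	out, err := Evaluate(Config{MaxUnsafe: n})
	if err != nil {
		return err.Error(), 1
	}
	return out, 0
}

func main() {
	out, code := runFromFlags("3")
	fmt.Println(out, code)
}
```

Unlike the workflow wrapper, this function does real work: parsing, translation, and error-to-exit-code mapping that the domain should never see.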

The "second consumer" test

An abstraction earns its existence when the second consumer appears. The Pipeline framework had exactly one consumer. The workflow layer had exactly one caller per method. Neither had earned its abstraction.

The "grep test"

Search for the abstraction's type name. If every result is either the definition or a single usage, it's premature:

grep -rn "Pipeline\[" --include='*.go' | wc -l
# 2: one definition, one usage. Delete it.

The Lesson

The codebase needed fewer abstractions, not more.

The Pipeline was built because "we might need a complex output pipeline later." We didn't. The workflow layer was built because "hexagonal architecture requires an app layer." It doesn't — not when the app layer adds zero logic.

Both took significant effort to build and significant effort to remove. The build cost was visible (lines of code, review time). The ongoing cost was invisible (cognitive load, debugging indirection, onboarding friction) — until it wasn't.

Deleting your own clever code is the hardest skill in software engineering. It means admitting that the 500 lines you wrote, reviewed, and merged made the project worse, not better.

But a few lines of sequential Go are more maintainable than a 500-line generic fluent API. Every time. It feels good to eliminate bloat in your codebase.


This was one of the big lessons learned during the development of Stave, an offline configuration safety evaluator.

Source: dev.to
