Commit be358a0 — Claude Code setup / Issue #48 Write speed test results to file (#81)
1 parent f6c72e6
42 files changed: +3506 −1090 lines

.claude/CLAUDE.md (369 additions, 0 deletions)
# NetPace Development Guide

## Summary

**TDD (Test-Driven Development) is non-negotiable.** Every line of production code must be written in response to a failing test. No exceptions.

This document guides Claude Code in maintaining NetPace, a cross-platform .NET 8.0 CLI application for network speed testing. Follow these standards strictly:

- **RED-GREEN-REFACTOR**: Write failing test → Make it pass → Improve code
- **C# best practices**: XML docs, async/await, nullable reference types
- **CLI excellence**: Follow clig.dev guidelines, Spectre.Console for UI
- **Clean architecture**: Core library separate from Console application
- **Cross-platform**: Windows, Linux, macOS support
## Core Philosophy

### Test-Driven Development is Mandatory

**TDD is not optional.** Every single change to production code must follow the RED-GREEN-REFACTOR cycle:

1. **RED**: Write a failing test first
   - No production code without a failing test
   - Test describes the behavior you want
   - Run the test and watch it fail (confirms the test is valid)

2. **GREEN**: Write minimum code to pass
   - Write only enough code to make the test pass
   - Don't add features "while you're there"
   - Get to green as quickly as possible

3. **REFACTOR**: Improve the code
   - Only after the test passes
   - Improve design, remove duplication, enhance readability
   - Tests must still pass after refactoring

**Critical Rules:**

- **Never write production code without a failing test first**
- **Never skip the RED step** - you must see the test fail
- **Never refactor on red** - always get to green first
- **Commit before refactoring** - so you can safely roll back if needed
- **Run all tests frequently** - catch regressions immediately
### Why This Matters

TDD provides:

- **Design feedback** - Hard to test = bad design
- **Regression protection** - Changes don't break existing behavior
- **Living documentation** - Tests show how code should be used
- **Confidence** - Refactor safely knowing tests will catch issues
## Project Overview

NetPace is a cross-platform network speed testing CLI application built with .NET 8.0, utilizing Ookla's Speedtest servers. It includes both a command-line application and a reusable Core library published to NuGet.

**Key Components:**

- `NetPace.Console` - Command-line application using Spectre.Console
- `NetPace.Core` - Reusable library with the `ISpeedTestService` interface
- `NetPace.Core` published as a NuGet package
## C# and .NET Standards

### Language and Framework

- Target Framework: **.NET 8.0**
- Language Version: **C# 12** (latest for .NET 8)
- Nullable Reference Types: **Enabled** (helps prevent null reference exceptions)
### Naming Conventions

- **PascalCase** for: Classes, methods, properties, namespaces, public fields
- **camelCase** for: Private fields, local variables, parameters
- **Interfaces** start with `I`: `ISpeedTestService`
- **Async methods** end with `Async`: `GetServersAsync()`
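Taken together, the conventions above look like this in practice. This is an illustrative sketch only; the type and member names here are invented for the example and are not part of the NetPace API:

```csharp
using System.Diagnostics;

/// <summary>Hypothetical service illustrating the naming conventions.</summary>
public interface IPingService                       // interface prefixed with I
{
    Task<TimeSpan> MeasureLatencyAsync(string host); // async method ends with Async
}

public class PingService : IPingService              // PascalCase class name
{
    private readonly int timeoutMs;                  // camelCase private field

    public int TimeoutMs => timeoutMs;               // PascalCase public property

    public PingService(int timeoutMs)                // camelCase parameter
        => this.timeoutMs = timeoutMs;

    public async Task<TimeSpan> MeasureLatencyAsync(string host)
    {
        var stopwatch = Stopwatch.StartNew();        // camelCase local variable
        await Task.Delay(1);                         // placeholder for real work
        return stopwatch.Elapsed;
    }
}
```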
### Code Organization

- **One class per file** (with exceptions for small, tightly related types)
- **File names match type names**: `OoklaSpeedtest.cs` contains the `OoklaSpeedtest` class
- **Namespace matches folder structure**: `NetPace.Core.Clients.Ookla`
### Best Practices

- **Use interfaces** for abstraction and testability (like `ISpeedTestService`)
- **Favor immutability**: Use `readonly` fields, consider `record` types for DTOs
- **Avoid magic strings/numbers**: Use constants or enums
- **Use `var`** when the type is obvious: `var result = GetResult();`
- **Explicit types** when clarity helps: `ISpeedTestService speedTester = ...`
- **XML documentation** on all public APIs (methods, properties, classes)
- **Async all the way**: Network operations should be async with proper cancellation token support
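One way to sketch the "async all the way" rule with cancellation support. The method name and the `httpClient` field are hypothetical, not part of the NetPace API:

```csharp
using System.Diagnostics;

/// <summary>Measures round-trip time to the given server URL.</summary>
/// <param name="serverUrl">The server endpoint to contact.</param>
/// <param name="cancellationToken">Token used to abort a long-running test.</param>
public async Task<TimeSpan> MeasureRoundTripAsync(
    string serverUrl,
    CancellationToken cancellationToken = default)
{
    var stopwatch = Stopwatch.StartNew();

    // Pass the token through to every awaited call so cancellation
    // propagates "all the way" down the async chain.
    using var response = await httpClient.GetAsync(serverUrl, cancellationToken);
    response.EnsureSuccessStatusCode();

    stopwatch.Stop();
    return stopwatch.Elapsed;
}
```

Accepting `CancellationToken` with a default value keeps the API convenient for callers that don't need cancellation while still supporting it for long operations.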
### Error Handling

- **Don't swallow exceptions** - let them bubble up unless you can meaningfully handle them
- **Use specific exception types** when creating custom exceptions
- **Validate inputs** early (guard clauses at method start)
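A minimal sketch of the guard-clause style, assuming a `Server` type with a `Url` property as used elsewhere in this guide (`ArgumentNullException.ThrowIfNull` is available from .NET 6 onward):

```csharp
public async Task<DownloadResult> GetDownloadSpeedAsync(Server server)
{
    // Guard clauses: validate inputs at the very top of the method
    ArgumentNullException.ThrowIfNull(server);
    if (string.IsNullOrWhiteSpace(server.Url))
        throw new ArgumentException("Server URL must be provided.", nameof(server));

    // ... the rest of the method runs with known-good inputs ...
}
```

Failing fast here gives callers a precise exception (`ArgumentNullException` vs. a `NullReferenceException` deep inside the method) and an actionable message.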
## CLI Application Specific Guidelines

### Command-Line Interface

- Follow the **[CLI Guidelines](https://clig.dev/)** (as per project philosophy)
- Use **Spectre.Console** for all console output and interaction
- Support **`--help`** and **`--version`** flags
- Provide **clear error messages** with actionable guidance
- Support **multiple output formats** (normal, CSV, JSON) for scripting

### User Experience

- **Default behavior should work for most users**: `NetPace` runs a simple test
- **Progress indication** for long-running operations
- **Verbosity levels**: Minimal (scripts), Normal (users), Debug (troubleshooting)
- **Cross-platform** considerations: file paths, line endings, console encoding

### Configuration

- Use **command-line options** over config files (CLI app principle)
- **Sensible defaults** - users shouldn't need to specify everything
- **Validate user input** and provide helpful error messages
## Testing

### Test Organization

- Test project naming: `NetPace.Core.Tests`, `NetPace.Console.Tests`
- Use **xUnit** as the testing framework
- **Given-When-Then** or **Arrange-Act-Assert** pattern in tests
- Test file mirrors source: `OoklaSpeedtest.cs` → `OoklaSpeedtestTests.cs`
### TDD Workflow in Practice

For every feature or bug fix:

```csharp
// 1. RED - Write the failing test first
[Fact]
public async Task GetDownloadSpeed_WhenServerResponds_ReturnsValidSpeed()
{
    // Given: A speed test service with a valid server
    var service = new OoklaSpeedtest();
    var server = new Server { Url = "http://test.example.com" };

    // When: We get the download speed
    var result = await service.GetDownloadSpeedAsync(server);

    // Then: Result should be valid
    Assert.NotNull(result);
    Assert.True(result.SpeedBitsPerSecond > 0);
}

// Run test - it MUST fail (because GetDownloadSpeedAsync doesn't exist yet)

// 2. GREEN - Write minimum implementation
public Task<DownloadResult> GetDownloadSpeedAsync(Server server)
{
    // Simplest thing that makes the test pass. No await is needed yet,
    // so return Task.FromResult rather than marking the method async
    // (avoids the CS1998 "async method lacks await" warning).
    return Task.FromResult(new DownloadResult { SpeedBitsPerSecond = 1_000_000 });
}

// Run test - it should pass now

// 3. REFACTOR - Now improve the implementation
public async Task<DownloadResult> GetDownloadSpeedAsync(Server server)
{
    // Now add the proper implementation. (In production, reuse a single
    // HttpClient instance rather than creating one per call.)
    var client = new HttpClient();
    var stopwatch = Stopwatch.StartNew();
    var bytes = await client.GetByteArrayAsync(server.Url);
    stopwatch.Stop();

    // Multiply as long (8L) so large payloads don't overflow int
    var bitsPerSecond = (bytes.Length * 8L) / stopwatch.Elapsed.TotalSeconds;
    return new DownloadResult { SpeedBitsPerSecond = bitsPerSecond };
}

// Run test - still passes after refactoring
```
### What to Test

- **NetPace.Core**: Unit tests for all public APIs
- **Business logic**: Speed calculations, unit conversions, server selection
- **Happy paths**: Normal successful scenarios
- **Alternative scenarios**: Different configurations, edge cases
- **Error scenarios**: Invalid input, network failures, timeouts
- **Integration tests**: Real network calls (consider a separate test category)
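Error scenarios deserve the same TDD treatment as happy paths. A sketch of one such test, assuming `GetDownloadSpeedAsync` validates its input as described under Error Handling:

```csharp
// Hypothetical error-scenario test: invalid input should fail fast
[Fact]
public async Task GetDownloadSpeed_WhenServerIsNull_ThrowsArgumentNullException()
{
    // Given: A speed test service
    var service = new OoklaSpeedtest();

    // When/Then: Passing null must throw, not hang or return garbage
    await Assert.ThrowsAsync<ArgumentNullException>(
        () => service.GetDownloadSpeedAsync(null!));
}
```

Note `Assert.ThrowsAsync` rather than `Assert.Throws` - the async overload awaits the task so the exception is actually observed.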
### What NOT to Test

- **Spectre.Console output** - trust that the library works
- **Simple property getters/setters** with no logic
- **Third-party libraries** - assume they work
### Test Quality Standards

- Tests should be **readable** - another developer should understand what's being tested
- Tests should be **independent** - they can run in any order
- Tests should be **fast** - the entire test suite runs in seconds
- Tests should be **deterministic** - same input = same result, every time
- **Mock external dependencies** (network, filesystem, time) for unit tests
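A hand-rolled fake is often enough to keep unit tests fast and deterministic without a mocking library. The `ISpeedTestService` members shown here are illustrative; use the actual interface definition from `NetPace.Core`:

```csharp
/// <summary>
/// Deterministic fake for unit tests - performs no real network traffic,
/// so tests run in milliseconds and never flake on connectivity.
/// </summary>
public class FakeSpeedTestService : ISpeedTestService
{
    /// <summary>The canned speed every test run will receive.</summary>
    public double SpeedToReturn { get; set; } = 100_000_000; // 100 Mbps

    public Task<DownloadResult> GetDownloadSpeedAsync(Server server) =>
        Task.FromResult(new DownloadResult { SpeedBitsPerSecond = SpeedToReturn });
}
```

Console-layer tests can then inject this fake and assert on formatting and output behavior without ever touching the network.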
## Architecture Principles

### Separation of Concerns

- **NetPace.Core**: Business logic, no UI, no console output
- **NetPace.Console**: User interaction, parsing args, formatting output
- Core library should be **usable in any context** (console, web API, GUI)

### Dependency Injection Ready

- Design with DI in mind even if not using a container
- Depend on **interfaces, not concrete implementations**
- Constructor injection for dependencies
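Constructor injection needs no container to be useful. A minimal sketch (the command class name is hypothetical):

```csharp
// The command depends on the abstraction, never on OoklaSpeedtest directly.
public class SpeedTestCommand
{
    private readonly ISpeedTestService speedTestService;

    // Dependencies arrive through the constructor, so the class
    // advertises exactly what it needs and nothing is hidden.
    public SpeedTestCommand(ISpeedTestService speedTestService)
        => this.speedTestService = speedTestService;
}
```

Production code wires in the real provider (`new SpeedTestCommand(new OoklaSpeedtest())`) while tests pass a fake, with no changes to the command itself.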
### NuGet Package Considerations

- **Keep Core library dependencies minimal** (fewer version conflicts for consumers)
- **Document breaking changes** in release notes
- **Semantic versioning**: MAJOR.MINOR.PATCH
## Project-Specific Guidelines

### Speed Test Provider Pattern

- All speed test implementations should implement `ISpeedTestService`
- Currently using Ookla, but the architecture allows for alternatives
- Keep provider-specific code isolated in `Clients/{ProviderName}/`
### Units and Formatting

- Support both **SI (1000-based)** and **IEC (1024-based)** unit systems
- Support both **BitsPerSecond** and **BytesPerSecond**
- Auto-scale by default (Mbps, Gbps) but allow user override
- Consistent formatting across output modes (normal, CSV, JSON)
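The SI/IEC distinction matters because the two systems scale the same measurement differently. A sketch of the conversion (helper names are hypothetical):

```csharp
// SI (decimal) vs IEC (binary) scaling of the same measurement.
// For 100_000_000 bits/s:
//   SI:  100_000_000 / 1_000_000 = 100 Mbps    (megabits per second)
//   IEC: 100_000_000 / 1_048_576 ≈ 95.37 Mibps (mebibits per second)
public static class SpeedConversions
{
    /// <summary>SI megabits: 1 Mb = 1000^2 bits.</summary>
    public static double ToMegabits(double bitsPerSecond) =>
        bitsPerSecond / 1_000_000d;

    /// <summary>IEC mebibits: 1 Mib = 1024^2 bits.</summary>
    public static double ToMebibits(double bitsPerSecond) =>
        bitsPerSecond / (1024d * 1024d);
}
```

The ~5% gap between the two grows at larger prefixes (Gb vs Gib is ~7%), which is why the chosen system must be visible in every output mode.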
### Performance

- **Async operations** for all network calls
- Consider **HttpClient best practices** (singleton, pooling)
- **CancellationToken** support for long operations
- Measure and optimize **hot paths** (speed test loops)
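The HttpClient singleton guideline can be sketched as follows (class and method names are illustrative, not the actual NetPace implementation):

```csharp
public class SpeedTestDownloader
{
    // One shared client for the lifetime of the application. Creating a
    // new HttpClient per request leaks sockets (TIME_WAIT exhaustion)
    // and defeats connection pooling.
    private static readonly HttpClient httpClient = new();

    public async Task<byte[]> DownloadPayloadAsync(
        string url, CancellationToken cancellationToken = default)
    {
        // The GetByteArrayAsync(string, CancellationToken) overload
        // is available from .NET 5 onward.
        return await httpClient.GetByteArrayAsync(url, cancellationToken);
    }
}
```

In a DI setup, `IHttpClientFactory` achieves the same pooling benefits with better DNS-refresh behavior; a static client is the simplest correct choice for a short-lived CLI.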
## Development Workflow

### Starting New Work

1. **Pull latest from main**

   ```bash
   git checkout main
   git pull origin main
   ```

2. **Create feature branch**

   ```bash
   git checkout -b feature/your-feature-name
   ```

3. **Review CLAUDE.md** for project standards
### The TDD Cycle (For Every Change)

```
┌─────────────────────────────────────────────┐
│ 1. RED - Write failing test                 │
│    - Describes desired behavior             │
│    - Run and watch it FAIL                  │
└──────────────┬──────────────────────────────┘
               ▼
┌─────────────────────────────────────────────┐
│ 2. GREEN - Make test pass                   │
│    - Write minimum code needed              │
│    - Run and watch it PASS                  │
└──────────────┬──────────────────────────────┘
               ▼
┌─────────────────────────────────────────────┐
│ 3. REFACTOR - Improve code (optional)       │
│    - Commit before refactoring              │
│    - Improve design/remove duplication      │
│    - Run tests - still PASS                 │
└──────────────┬──────────────────────────────┘
               ▼
     Back to RED for next behavior
```
### During Development

1. **Follow the TDD cycle** for every behavior change
2. Add **XML documentation** to public APIs as you go
3. **Commit frequently** - especially before refactoring
4. Run the **full test suite** regularly

### Before Committing

- Build succeeds with **no warnings**
- **All tests pass**
- Code follows **naming conventions**
- Public APIs have **XML documentation**
- No **commented-out code** (delete it, git remembers)

### Commit Messages

- **Clear and concise**: "Add support for custom server URLs"
- **Imperative mood**: "Add feature" not "Added feature"
- **Reference issues** if applicable: "Fix #123: Handle null server response"
## Common Patterns in This Project

### Result Objects

Prefer returning result objects with rich information:

```csharp
public class DownloadResult
{
    public double SpeedBitsPerSecond { get; init; }
    public TimeSpan Duration { get; init; }
    public long BytesTransferred { get; init; }

    public string GetSpeedString(SpeedUnit unit, SpeedUnitSystem system) { ... }
}
```

### Extension Methods

Use extension methods for formatting and conversion logic that doesn't belong in core types.

### Options Pattern

For complex configuration, use options objects instead of many parameters:

```csharp
public async Task<DownloadResult> GetDownloadSpeedAsync(
    Server server,
    DownloadTestSettings? settings = null)
```
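A settings object pairs naturally with a `record` and `init`-only properties, so callers override only what they need. The property names below are hypothetical, not the real `DownloadTestSettings` definition:

```csharp
// Hypothetical settings record - actual property names may differ
public record DownloadTestSettings
{
    public int ParallelConnections { get; init; } = 4;
    public TimeSpan Timeout { get; init; } = TimeSpan.FromSeconds(30);
}

// Callers set only what they need; everything else keeps its default.
var result = await service.GetDownloadSpeedAsync(
    server,
    new DownloadTestSettings { Timeout = TimeSpan.FromSeconds(10) });
```

Adding a new setting later is then a non-breaking change: existing callers compile unchanged, and the new property simply takes its default.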
## When Working with Claude Code

### Claude Must Always

- **Follow TDD** - write a failing test before any production code
- Follow the **RED-GREEN-REFACTOR** cycle
- Add **XML documentation** to public members
- Consider **cross-platform compatibility**
- Write **testable code** (interfaces, dependency injection)
- Ask for clarification if requirements are ambiguous

### Tell Claude About

- Which component you're working on (Core vs Console)
- Whether changes affect the public API (NuGet consumers)
- Platform-specific considerations
- Performance requirements

### Never Let Claude

- Write production code without a failing test first
- Skip the RED step (must see the test fail)
- Change public APIs without discussion
- Add dependencies to Core without good reason
- Skip error handling
- Commit code with failing tests

### Example Interaction

**Good:**

```
You: Add validation to ensure server URL is not null
Claude: I'll follow TDD. First, I'll write a test that expects an
        ArgumentNullException when server URL is null...
        [writes failing test]
        [makes it pass]
        [suggests refactoring if appropriate]
```

**Bad:**

```
You: Add validation to ensure server URL is not null
Claude: Here's the updated code with null checking...
        [provides implementation without test]

STOP - This violates TDD!
```
## Resources

- [.NET API Documentation](https://learn.microsoft.com/en-us/dotnet/api/)
- [C# Coding Conventions](https://learn.microsoft.com/en-us/dotnet/csharp/fundamentals/coding-style/coding-conventions)
- [CLI Guidelines](https://clig.dev/)
- [Spectre.Console Documentation](https://spectreconsole.net/)
- [xUnit Documentation](https://xunit.net/)
- [Test-Driven Development by Example (Kent Beck)](https://www.amazon.com/Test-Driven-Development-Kent-Beck/dp/0321146530)

---

**Last Updated**: November 2025
**Maintained by**: Frank Ray
**Project**: https://github.com/FrankRay78/NetPace
**Philosophy**: Test-Driven Development is non-negotiable