# Subnetter Manual Testing Guide
This guide provides detailed instructions for manually testing the Subnetter application. It is designed to validate that the application functions correctly across various scenarios, edge cases, and environments.
## Table of Contents
- Prerequisites
- Setup for Testing
- Basic Functionality Tests
- Configuration Validation Tests
- CIDR Allocation Tests
- Edge Case Testing
- Output Validation
- Performance Testing
- Cross-Platform Testing
- CLI Options Testing
- Integration Testing
- Module Compatibility Testing
- Regression Testing
- Test Reporting
## Prerequisites
Before beginning manual testing, ensure you have the following:
- Node.js Environment: Node.js v18 or higher installed (v22+ recommended)
- Git: For cloning the repository
- Text Editor: For editing configuration files
- Terminal/Command Line: For running the application
- CSV Viewer: For examining output files (e.g., Excel, Google Sheets)
- Subnetter Source Code: Clone the repository or install via npm
## Setup for Testing

### Local Development Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/gangster/subnetter.git
   cd subnetter
   ```

2. Install dependencies using Yarn (the project uses a zero-install configuration):

   ```bash
   # The project includes Yarn in the .yarn/releases directory
   ./yarn install
   ```

3. Build the project:

   ```bash
   ./yarn build
   ```

4. Link for local testing:

   ```bash
   # If using npm for a global link
   npm link

   # If using yarn
   ./yarn link
   ```
### NPM Installation Testing

Test the installation process as an end-user would experience it:

1. Create a new directory for testing:

   ```bash
   mkdir subnetter-test
   cd subnetter-test
   ```

2. Install Subnetter:

   ```bash
   # Using npm
   npm install -g subnetter

   # Using yarn
   yarn global add subnetter
   ```

3. Verify the installation:

   ```bash
   subnetter --version
   ```
## Basic Functionality Tests

### Command Availability Test

**Objective:** Verify that all commands are available and respond correctly.

**Steps:**

1. Run the help command:

   ```bash
   subnetter --help
   ```

2. Verify that the output displays all available commands and options.

**Expected Results:**

- Help text should be displayed for all commands: `generate`, `validate`
- Options should be listed for each command
- No errors should be thrown
### Version Command Test

**Objective:** Verify that the version command works correctly.

**Steps:**

1. Run the version command:

   ```bash
   subnetter --version
   ```

**Expected Results:**

- The current version number should be displayed
- The format should follow semantic versioning (e.g., 1.0.0)
### Generate Command Basic Test

**Objective:** Verify that the generate command works with a minimal valid configuration.

**Steps:**

1. Create a minimal configuration file, `minimal-config.json`:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24,
       "Private": 26
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate minimal-config.json -o minimal-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Output file `minimal-output.csv` should be created
- Output should contain 6 allocations (1 account × 1 region × 3 AZs × 2 subnet types)
- Each allocation should have the correct fields (Account Name, VPC Name, etc.)
### Validate Command Basic Test

**Objective:** Verify that the validate command works correctly.

**Steps:**

1. Use the same minimal configuration file from above.
2. Run the validate command:

   ```bash
   subnetter validate -c minimal-config.json
   ```

**Expected Results:**

- Command should complete without errors
- Output should indicate that the configuration is valid
### YAML Configuration Test

**Objective:** Verify that YAML configuration files are supported and work correctly.

**Steps:**

1. Create a minimal YAML configuration file, `minimal-config.yaml`:

   ```yaml
   # Minimal YAML configuration
   baseCidr: 10.0.0.0/8
   cloudProviders:
     - aws
   accounts:
     - name: test-account
       clouds:
         aws:
           regions:
             - us-east-1
   subnetTypes:
     Public: 24
     Private: 26
   ```

2. Run the validate command:

   ```bash
   subnetter validate -c minimal-config.yaml
   ```

3. Run the generate command:

   ```bash
   subnetter generate -c minimal-config.yaml -o yaml-output.csv
   ```

**Expected Results:**

- Both commands should complete without errors
- Validation should indicate that the configuration is valid
- Generation should produce the same allocations as the JSON configuration
- Output file should contain 6 allocations (1 account × 1 region × 3 AZs × 2 subnet types)
## Configuration Validation Tests

### Invalid Configuration Test

**Objective:** Verify that invalid configurations are rejected with clear error messages.

#### Test Case 1: Missing Required Fields

**Steps:**

1. Create `invalid-missing-fields.json`. Note that this configuration deliberately omits the required `subnetTypes` and `cloudProviders` fields:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ]
   }
   ```

2. Run validation:

   ```bash
   subnetter validate -c invalid-missing-fields.json
   ```

**Expected Results:**

- Validation should fail
- Error message should mention the missing required fields
- Error message should be clear and actionable, indicating what needs to be fixed
- Exit code should be non-zero

**Criteria for Evaluating Error Messages:**

- Specificity: Error messages should identify the specific issue
- Location: Error messages should indicate where in the configuration the issue is
- Actionability: Error messages should suggest how to fix the issue
- Clarity: Error messages should be understandable to users without deep knowledge of the codebase
#### Test Case 2: Invalid CIDR Format

**Steps:**

1. Create `invalid-cidr-format.json`. Note the invalid prefix length of /33 in `baseCidr`:

   ```json
   {
     "baseCidr": "10.0.0.0/33",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24
     }
   }
   ```

2. Run validation:

   ```bash
   subnetter validate invalid-cidr-format.json
   ```

**Expected Results:**

- Validation should fail
- Error message should mention the invalid CIDR format
- Error should point to the `baseCidr` field
#### Test Case 3: Empty Account Name

**Steps:**

1. Create `invalid-empty-account-name.json`. Note the empty account name:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24
     }
   }
   ```

2. Run validation:

   ```bash
   subnetter validate invalid-empty-account-name.json
   ```

**Expected Results:**

- Validation should fail
- Error message should mention the empty account name
- Error should point to the `accounts[0].name` field
## CIDR Allocation Tests

### Multi-Account Allocation Test

**Objective:** Verify that CIDR allocation works correctly with multiple accounts.

**Steps:**

1. Create `multi-account-config.json`:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "prefixLengths": {
       "account": 16,
       "region": 20,
       "az": 24
     },
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "dev-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1", "us-west-2"]
           }
         }
       },
       {
         "name": "prod-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1", "eu-west-1", "ap-southeast-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 26,
       "Private": 28
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate multi-account-config.json -o multi-account-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Output should contain 30 allocations (2 + 3 = 5 regions total; 5 regions × 3 AZs × 2 subnet types = 30 allocations)
- CIDRs should be correctly nested with no overlaps
- First account should use 10.0.0.0/16 and second should use 10.1.0.0/16
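As a quick sanity check, the expected allocation count can be computed directly from the configuration. The sketch below is not part of Subnetter's API; it assumes a fixed three AZs per region, the convention this guide uses throughout:

```javascript
// Sketch: compute the expected number of allocations for a configuration.
// Assumes 3 AZs per region, matching the counts used in this guide.
const AZS_PER_REGION = 3;

function expectedAllocations(config) {
  const subnetTypeCount = Object.keys(config.subnetTypes).length;
  let regionCount = 0;
  for (const account of config.accounts) {
    for (const cloud of Object.values(account.clouds)) {
      regionCount += cloud.regions.length;
    }
  }
  return regionCount * AZS_PER_REGION * subnetTypeCount;
}

// Multi-account example above: 5 regions x 3 AZs x 2 subnet types = 30
const config = {
  baseCidr: '10.0.0.0/8',
  cloudProviders: ['aws'],
  accounts: [
    { name: 'dev-account', clouds: { aws: { regions: ['us-east-1', 'us-west-2'] } } },
    { name: 'prod-account', clouds: { aws: { regions: ['us-east-1', 'eu-west-1', 'ap-southeast-1'] } } }
  ],
  subnetTypes: { Public: 26, Private: 28 }
};
console.log(expectedAllocations(config)); // 30
```

Comparing this number against the row count of the generated CSV (minus the header) is a fast first check before inspecting individual CIDRs.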
### Multi-Cloud Provider Test

**Objective:** Verify allocation across multiple cloud providers.

**Steps:**

1. Create `multi-cloud-config.json`:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "prefixLengths": {
       "account": 16,
       "region": 20,
       "az": 24
     },
     "cloudProviders": ["aws", "azure", "gcp"],
     "accounts": [
       {
         "name": "cloud-dev",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           },
           "azure": {
             "regions": ["eastus"]
           },
           "gcp": {
             "regions": ["us-central1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 26,
       "Private": 27
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate multi-cloud-config.json -o multi-cloud-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Output should contain 18 allocations (1 account × 3 providers × 1 region per provider × 3 AZs × 2 subnet types = 18 allocations)
- Each provider should use its own region and AZ naming patterns:
  - AWS should have AZs like us-east-1a, us-east-1b, us-east-1c
  - Azure should have AZs like eastusa, eastusb, eastusc
  - GCP should have AZs like us-central1a, us-central1b, us-central1c
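The AZ names expected above all follow the same shape: region name plus a one-letter suffix. A small sketch (not Subnetter code) makes the pattern explicit for checking the output:

```javascript
// Sketch: the AZ naming pattern described above — region name plus a
// letter suffix ('a', 'b', 'c') — applied uniformly across providers.
function azNames(region, count = 3) {
  return Array.from({ length: count }, (_, i) =>
    `${region}${String.fromCharCode(97 + i)}` // 97 is the char code of 'a'
  );
}

console.log(azNames('us-east-1'));   // ['us-east-1a', 'us-east-1b', 'us-east-1c']
console.log(azNames('eastus'));      // ['eastusa', 'eastusb', 'eastusc']
console.log(azNames('us-central1')); // ['us-central1a', 'us-central1b', 'us-central1c']
```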
### Account-Specific CIDR Override Test

**Objective:** Verify that account-specific CIDR blocks are respected.

**Steps:**

1. Create `account-cidr-override-config.json`:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "default-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       },
       {
         "name": "override-account",
         "clouds": {
           "aws": {
             "baseCidr": "172.16.0.0/12",
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate account-cidr-override-config.json -o account-cidr-override-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- `default-account` should use CIDRs from the 10.0.0.0/8 range
- `override-account` should use CIDRs from the 172.16.0.0/12 range
- No CIDR overlaps should occur
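To spot-check that each allocated CIDR falls inside the expected base range, a containment check like the following can be run against values copied from the output CSV. This is a hypothetical helper for manual verification, not part of Subnetter's API:

```javascript
// Sketch: check that an allocated CIDR falls inside an expected base range.
function cidrToRange(cidr) {
  const [addr, prefixStr] = cidr.split('/');
  // Convert dotted-quad to a 32-bit integer.
  const ip = addr.split('.').reduce((acc, oct) => acc * 256 + Number(oct), 0);
  const size = 2 ** (32 - Number(prefixStr));
  const start = Math.floor(ip / size) * size; // align to the network boundary
  return [start, start + size - 1];
}

function cidrContains(outer, inner) {
  const [oStart, oEnd] = cidrToRange(outer);
  const [iStart, iEnd] = cidrToRange(inner);
  return iStart >= oStart && iEnd <= oEnd;
}

console.log(cidrContains('172.16.0.0/12', '172.16.5.0/24')); // true
console.log(cidrContains('10.0.0.0/8', '172.16.5.0/24'));    // false
```

Every `override-account` row should satisfy `cidrContains('172.16.0.0/12', subnetCidr)`, and every `default-account` row the same check against `10.0.0.0/8`.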
### Different Subnet Sizes Test

**Objective:** Verify that different subnet sizes are correctly allocated.

**Steps:**

1. Create `subnet-sizes-config.json`:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24,
       "Private": 25,
       "Data": 26,
       "Management": 27
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate subnet-sizes-config.json -o subnet-sizes-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Public subnets should have /24 CIDR blocks (254 usable IPs)
- Private subnets should have /25 CIDR blocks (126 usable IPs)
- Data subnets should have /26 CIDR blocks (62 usable IPs)
- Management subnets should have /27 CIDR blocks (30 usable IPs)
- The Usable IPs column should correctly reflect these numbers
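The usable-IP counts above follow the standard formula 2^(32 − prefix) − 2, which excludes the network and broadcast addresses. A sketch for recomputing the expected values (the /31 and /32 special cases reflect RFC 3021 point-to-point links and single-host routes, matching the /32 expectation later in this guide):

```javascript
// Sketch: usable IPs per subnet: 2^(32 - prefix) - 2 for ordinary subnets
// (network and broadcast excluded), with special cases for /31 and /32.
function usableIps(prefix) {
  if (prefix === 32) return 1; // single host
  if (prefix === 31) return 2; // RFC 3021 point-to-point
  return 2 ** (32 - prefix) - 2;
}

console.log(usableIps(24)); // 254
console.log(usableIps(27)); // 30
```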
## Edge Case Testing

### Minimum Prefix Length Test

**Objective:** Test with the smallest possible prefix length (largest network).

**Steps:**

1. Create `min-prefix-config.json`, using 0.0.0.0/0 (the entire IPv4 address space) as the base CIDR:

   ```json
   {
     "baseCidr": "0.0.0.0/0",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 1
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate min-prefix-config.json -o min-prefix-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Allocations should be valid
- Initial allocation should be 0.0.0.0/0
- Subsequent allocations should use prefix length 1
### Maximum Prefix Length Test

**Objective:** Test with the largest possible prefix length (smallest network).

**Steps:**

1. Create `max-prefix-config.json`:

   ```json
   {
     "baseCidr": "10.0.0.0/24",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 32
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate max-prefix-config.json -o max-prefix-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Subnet CIDRs should have a /32 prefix length
- Usable IPs should be 1 for each subnet
### Large Number of Allocations Test

**Objective:** Test with a large number of allocations to check performance and memory usage.

**Steps:**

1. Create `large-config.json` with many accounts, regions, and subnet types (abbreviated sample):

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "cloudProviders": ["aws", "azure", "gcp"],
     "accounts": [
       {
         "name": "account-1",
         "clouds": {
           "aws": {
             "regions": ["us-east-1", "us-east-2", "us-west-1", "us-west-2", "eu-west-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Type1": 24,
       "Type2": 24
     }
   }
   ```

   Repeat the account entry for account-2 through account-10 and the subnet type entries for Type3 through Type10, so the configuration has 10 accounts with 5 regions each and 10 subnet types.

2. Run the generate command with time tracking:

   ```bash
   time subnetter generate large-config.json -o large-output.csv
   ```

**Expected Results:**

- Command should complete without errors or excessive memory usage
- Output should contain all allocations (10 accounts × 5 regions × 3 AZs × 10 subnet types = 1500 allocations)
- Execution time should be reasonable (under 30 seconds)
### Not Enough Space Error Test

**Objective:** Verify that appropriate errors are thrown when there's not enough address space.

**Steps:**

1. Create `not-enough-space-config.json`, using a deliberately small /22 base CIDR:

   ```json
   {
     "baseCidr": "10.0.0.0/22",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1", "us-east-2", "us-west-1", "us-west-2"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate not-enough-space-config.json -o error-output.csv
   ```

**Expected Results:**

- Command should fail with an error
- Error message should indicate insufficient space
- Error message should be clear and actionable
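The arithmetic behind the expected failure: 4 regions × 3 AZs × 1 subnet type requires twelve /24 blocks (3,072 addresses), while a /22 only spans 1,024 addresses. A sketch of the calculation (assuming three AZs per region, as elsewhere in this guide):

```javascript
// Sketch: required address space vs. available space for this configuration.
const azsPerRegion = 3;
const regions = 4;
const subnetPrefix = 24;
const basePrefix = 22;

const required = regions * azsPerRegion * 2 ** (32 - subnetPrefix); // 12 x 256 = 3072
const available = 2 ** (32 - basePrefix);                           // 1024
console.log(required > available); // true: the /22 cannot hold twelve /24 subnets
```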
## Output Validation

### CSV Format Test

**Objective:** Verify that the CSV output has the correct format and content.

**Steps:**

1. Use one of the previous test configurations.
2. Run the generate command:

   ```bash
   subnetter generate subnet-sizes-config.json -o csv-format-output.csv
   ```

3. Open the CSV file and validate its contents.

**Expected Results:**

- CSV file should have a header row with all expected columns:
  - Account Name
  - VPC Name
  - Cloud Provider
  - Region Name
  - Availability Zone
  - Region CIDR
  - VPC CIDR
  - AZ CIDR
  - Subnet CIDR
  - CIDR
  - Subnet Role
  - Usable IPs
- Data rows should contain valid values for each column
- CIDR blocks should be valid IPv4 CIDR notation
- Usable IPs should be correctly calculated from the CIDR prefix length
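The header check can be scripted instead of eyeballed. The sketch below mirrors the column list above; it is a manual-testing aid, not Subnetter code, and assumes a simple comma-separated header with no quoted fields:

```javascript
// Sketch: verify the header row of the generated CSV against the
// expected column list from this guide.
const EXPECTED_COLUMNS = [
  'Account Name', 'VPC Name', 'Cloud Provider', 'Region Name',
  'Availability Zone', 'Region CIDR', 'VPC CIDR', 'AZ CIDR',
  'Subnet CIDR', 'CIDR', 'Subnet Role', 'Usable IPs'
];

function checkHeader(headerLine) {
  const columns = headerLine.trim().split(',');
  const missing = EXPECTED_COLUMNS.filter((c) => !columns.includes(c));
  return { ok: missing.length === 0, missing };
}

// Against a real file:
//   const [header] = require('fs').readFileSync('csv-format-output.csv', 'utf8').split('\n');
//   console.log(checkHeader(header));
const sample = EXPECTED_COLUMNS.join(',');
console.log(checkHeader(sample).ok); // true
```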
### No Overlapping CIDRs Test

**Objective:** Verify that no CIDR blocks overlap in the output.

**Steps:**

1. Use a complex configuration with multiple accounts, regions, and subnet types.
2. Run the generate command:

   ```bash
   subnetter generate multi-account-config.json -o overlap-test-output.csv
   ```

3. Analyze the output to check for overlapping CIDRs.

**Expected Results:**

- No CIDR blocks should overlap
- Each allocation should have a unique CIDR block
- CIDR blocks should be correctly nested according to the hierarchy (account > region > AZ > subnet)
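Overlap analysis by hand is error-prone for dozens of rows. One way to automate it (a sketch, not Subnetter code): convert each CIDR to an integer range, sort by start address, and flag any block that starts before the furthest-reaching earlier block ends. To use it against the CSV, collect the Subnet CIDR column into the `cidrs` array first.

```javascript
// Sketch: detect overlapping subnet CIDRs in a list.
function cidrToRange(cidr) {
  const [addr, prefixStr] = cidr.split('/');
  const ip = addr.split('.').reduce((acc, oct) => acc * 256 + Number(oct), 0);
  const size = 2 ** (32 - Number(prefixStr));
  return [ip, ip + size - 1];
}

function findOverlaps(cidrs) {
  const ranges = cidrs
    .map((cidr) => ({ cidr, range: cidrToRange(cidr) }))
    .sort((a, b) => a.range[0] - b.range[0]);
  const overlaps = [];
  let maxSoFar = null; // the block extending furthest to the right so far
  for (const r of ranges) {
    // After sorting by start, a block overlaps some earlier block exactly
    // when it starts before the furthest-reaching earlier block ends.
    if (maxSoFar && r.range[0] <= maxSoFar.range[1]) {
      overlaps.push([maxSoFar.cidr, r.cidr]);
    }
    if (!maxSoFar || r.range[1] > maxSoFar.range[1]) maxSoFar = r;
  }
  return overlaps;
}

console.log(findOverlaps(['10.0.0.0/24', '10.0.1.0/24'])); // [] — adjacent, no overlap
console.log(findOverlaps(['10.0.0.0/23', '10.0.1.0/24'])); // reports the overlapping pair
```

An empty result means the subnet-level allocations are disjoint; note that parent CIDRs (account, region, AZ) are supposed to contain their children, so run this only on one level of the hierarchy at a time.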
## Performance Testing

### Command Execution Time Test

**Objective:** Measure the execution time for different configuration sizes.

**Steps:**

1. Prepare configurations of different sizes:
   - Small: 1 account, 1 region, 2 subnet types
   - Medium: 5 accounts, 3 regions each, 5 subnet types
   - Large: 10 accounts, 5 regions each, 10 subnet types
2. Run the generate command with time measurement:

   ```bash
   time subnetter generate small-config.json -o small-output.csv
   time subnetter generate medium-config.json -o medium-output.csv
   time subnetter generate large-config.json -o large-output.csv
   ```

**Expected Results:**

- Execution time should scale reasonably with configuration size
- Large configurations should complete within a reasonable time (under 30 seconds)
- No memory errors or excessive CPU usage
### Memory Usage Test

**Objective:** Monitor memory usage during execution.

**Steps:**

1. Use a large configuration.
2. Run the generate command with memory monitoring (`-v` is a GNU time flag; on macOS, use `/usr/bin/time -l` instead):

   ```bash
   /usr/bin/time -v subnetter generate large-config.json -o memory-test-output.csv
   ```

**Expected Results:**

- Memory usage should be reasonable (under 500MB for large configurations)
- No memory leaks or excessive growth during execution
## Cross-Platform Testing

### Windows Compatibility Test

**Objective:** Verify that the application works correctly on Windows.

**Steps:**

1. Install the application on a Windows system.
2. Run basic commands:

   ```bash
   subnetter --version
   subnetter generate -c minimal-config.json -o windows-output.csv
   ```

**Expected Results:**

- Commands should execute without errors
- Output should be identical to that on Unix-based systems
- File paths should be handled correctly

### macOS Compatibility Test

**Objective:** Verify that the application works correctly on macOS.

**Steps:**

1. Install the application on a macOS system.
2. Run basic commands:

   ```bash
   subnetter --version
   subnetter generate -c minimal-config.json -o macos-output.csv
   ```

**Expected Results:**

- Commands should execute without errors
- Output should be identical to that on other platforms

### Linux Compatibility Test

**Objective:** Verify that the application works correctly on Linux.

**Steps:**

1. Install the application on a Linux system.
2. Run basic commands:

   ```bash
   subnetter --version
   subnetter generate -c minimal-config.json -o linux-output.csv
   ```

**Expected Results:**

- Commands should execute without errors
- Output should be identical to that on other platforms
## CLI Options Testing

### Verbose Output Test

**Objective:** Verify that the verbose output option works correctly.

**Steps:**

1. Run the generate command with verbose output:

   ```bash
   subnetter generate -c minimal-config.json -o verbose-output.csv --verbose
   ```

**Expected Results:**

- Command should complete without errors
- Output should include detailed logging information
- Debug-level messages should be displayed
- The internal allocation process should be visible

### Custom Output File Test

**Objective:** Verify that custom output file paths work correctly.

**Steps:**

1. Run the generate command with different output file paths:

   ```bash
   subnetter generate -c minimal-config.json -o ./output/custom-path.csv
   subnetter generate -c minimal-config.json -o /tmp/absolute-path.csv
   ```

**Expected Results:**

- Command should complete without errors
- Output files should be created at the specified paths
- Directory structure should be created if it doesn't exist

### Help Option Test

**Objective:** Verify that the help option works for all commands.

**Steps:**

1. Run help for different commands:

   ```bash
   subnetter --help
   subnetter generate --help
   subnetter validate --help
   ```

**Expected Results:**

- Help text should be displayed for each command
- Options should be clearly described
- Usage examples should be provided
## Integration Testing

### NPM Package Integration Test

**Objective:** Verify that the NPM package can be integrated into other projects.

**Steps:**

1. Create a new Node.js project.
2. Install the Subnetter package:

   ```bash
   npm install subnetter
   ```

3. Create a simple script, `script.js`, that uses the Subnetter API:

   ```javascript
   const { generateAllocations } = require('subnetter');

   const config = {
     baseCidr: '10.0.0.0/8',
     cloudProviders: ['aws'],
     accounts: [
       {
         name: 'test-account',
         clouds: {
           aws: {
             regions: ['us-east-1']
           }
         }
       }
     ],
     subnetTypes: {
       Public: 24,
       Private: 26
     }
   };

   const allocations = generateAllocations(config);
   console.log(`Generated ${allocations.length} allocations`);
   ```

4. Run the script:

   ```bash
   node script.js
   ```

**Expected Results:**

- Script should execute without errors
- Allocations should be generated correctly
- API should be usable programmatically
### Docker Container Test

**Objective:** Verify that the application works correctly in a Docker container.

**Steps:**

1. Create a Dockerfile:

   ```dockerfile
   FROM node:18-alpine
   RUN npm install -g subnetter
   WORKDIR /app
   COPY minimal-config.json .
   ENTRYPOINT ["subnetter"]
   ```

2. Build and run the Docker image:

   ```bash
   docker build -t subnetter-test .
   docker run -v $(pwd):/app subnetter-test generate -c minimal-config.json -o docker-output.csv
   ```

**Expected Results:**

- Docker container should build and run without errors
- Output file should be created and accessible from the host
- Results should be identical to running directly on the host
## Module Compatibility Testing

### TypeScript Import Test

**Objective:** Verify that the package can be imported in TypeScript projects.

**Steps:**

1. Create a new TypeScript project.
2. Install the Subnetter package:

   ```bash
   npm install subnetter
   ```

3. Create a TypeScript file, `test.ts`, that imports and uses the package:

   ```typescript
   import { generateAllocations, Config } from 'subnetter';

   const config: Config = {
     baseCidr: '10.0.0.0/8',
     cloudProviders: ['aws'],
     accounts: [
       {
         name: 'test-account',
         clouds: {
           aws: {
             regions: ['us-east-1']
           }
         }
       }
     ],
     subnetTypes: {
       Public: 24,
       Private: 26
     }
   };

   const allocations = generateAllocations(config);
   console.log(`Generated ${allocations.length} allocations`);
   ```

4. Compile and run the TypeScript file:

   ```bash
   tsc test.ts
   node test.js
   ```

**Expected Results:**

- TypeScript should compile without errors
- Types should be correctly defined and usable
- Script should execute without errors

### ESM Import Test

**Objective:** Verify that the package can be imported in ESM projects.

**Steps:**

1. Create a new ESM project.
2. Install the Subnetter package:

   ```bash
   npm install subnetter
   ```

3. Create an ESM file, `test.mjs`, that imports and uses the package:

   ```javascript
   import { generateAllocations } from 'subnetter';

   const config = {
     baseCidr: '10.0.0.0/8',
     cloudProviders: ['aws'],
     accounts: [
       {
         name: 'test-account',
         clouds: {
           aws: {
             regions: ['us-east-1']
           }
         }
       }
     ],
     subnetTypes: {
       Public: 24,
       Private: 26
     }
   };

   const allocations = generateAllocations(config);
   console.log(`Generated ${allocations.length} allocations`);
   ```

4. Run the ESM file (the `--experimental-modules` flag is unnecessary on Node.js 18+, where `.mjs` files are supported natively):

   ```bash
   node test.mjs
   ```

**Expected Results:**

- ESM import should work without errors
- Script should execute without errors
## Regression Testing

### Previous Version Compatibility Test

**Objective:** Verify that configurations from previous versions still work.

**Steps:**

1. Create a configuration file, `legacy-config.json`, using the format from a previous version:

   ```json
   {
     "baseCidr": "10.0.0.0/8",
     "cloudProviders": ["aws"],
     "accounts": [
       {
         "name": "test-account",
         "clouds": {
           "aws": {
             "regions": ["us-east-1"]
           }
         }
       }
     ],
     "subnetTypes": {
       "Public": 24,
       "Private": 26
     }
   }
   ```

2. Run the generate command:

   ```bash
   subnetter generate -c legacy-config.json -o legacy-output.csv
   ```

**Expected Results:**

- Command should complete without errors
- Legacy configuration format should be accepted and processed correctly
- Output should be identical to using the current format
### Fixed Bug Verification Test

**Objective:** Verify that previously fixed bugs remain fixed.

**Steps:**

1. Create configurations that would have triggered known bugs in previous versions.
2. Run the generate command with these configurations.

**Expected Results:**

- Command should complete without errors
- Previously fixed bugs should not reappear
## Test Reporting

### Test Results Summary

After completing all manual tests, create a summary report with the following information:

1. **Test Coverage:**
   - Number of tests executed
   - Number of tests passed/failed
   - Areas covered by testing
2. **Issues Found:**
   - Description of any issues found
   - Steps to reproduce
   - Severity assessment
3. **Performance Metrics:**
   - Execution time for different configuration sizes
   - Memory usage statistics
4. **Compatibility Assessment:**
   - Platform compatibility results
   - Module integration results
5. **Recommendations:**
   - Suggested improvements
   - Areas requiring additional testing

This report will help track the application's quality over time and identify areas for improvement.
## Conclusion
This manual testing guide provides a comprehensive approach to validating the functionality, performance, and reliability of the Subnetter application. By systematically working through these test cases, you can ensure that the application meets all requirements and handles edge cases appropriately.
Remember to update this guide as new features are added or existing ones are modified to ensure continued test coverage.