Design First, Code Later: A REST Assured API Automation Framework
- Kalpana Birkodi
- Feb 12
- 4 min read
Lessons from a Hackathon — as a Beginner and a Test Lead
A first REST Assured API testing hackathon often looks straightforward at the start: automate APIs, write test cases, and help the team pass scenarios.
But in real projects, another responsibility quickly emerges: designing the automation framework for the entire team.
In this case, the role involved acting as a Test Lead while still being a beginner in REST Assured—a situation many engineers face early in their careers.
This creates a very real and practical challenge: How can an automation framework be designed so that beginners can understand it, teams can scale with it, and it still feels production-ready?
This blog walks through the design of a Cucumber BDD + REST Assured automation framework, the challenges encountered, the decisions made, and the lessons that apply to any real-world automation project.
Hackathon Context & Real Constraints
This was not a toy problem. The hackathon came with real-world complexity:
Multiple API modules (Login, User, Program, Batch, Skill)
A mix of authentication and non-authentication APIs
Login as mandatory environment setup
Every secured API dependent on a token
Parallel development across team members
Test data provided via Excel
CI/CD readiness required (Jenkins)
Very quickly, it became clear that ad-hoc automation would not scale.
The framework needed to be:
Centralized
Reusable
Easy for new contributors to adopt
Strong enough to resemble a real project, not just a hackathon solution
High-Level Framework Choices
The technology stack was intentionally kept simple and familiar:
REST Assured → API automation
Cucumber BDD → readability and collaboration
Excel (Apache POI) → data-driven testing
JUnit Runner → test execution
Allure + Extent Reports → reporting
Maven → dependency and build management
No unnecessary tools. No overengineering. Only proven, reliable choices.
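In the pom.xml, the core of this stack is just a handful of test-scoped dependencies. A minimal sketch (version numbers are illustrative; check Maven Central for current releases):

```xml
<dependencies>
    <dependency>
        <groupId>io.rest-assured</groupId>
        <artifactId>rest-assured</artifactId>
        <version>5.4.0</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-java</artifactId>
        <version>7.15.0</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-junit</artifactId>
        <version>7.15.0</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.poi</groupId>
        <artifactId>poi-ooxml</artifactId>
        <version>5.2.5</version>
        <scope>test</scope>
    </dependency>
</dependencies>
```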
One Hook to Rule Them All (Single Hook Design)
The Initial Problem
In a team setup, it is common for contributors to want:
Separate @Before hooks
Individual login logic
Module-specific setup steps
Allowing this leads to:
Duplicate code
Unpredictable execution order
Bugs that are difficult to trace
The Decision: One Global Hook
A single global Hooks.java was introduced for the entire project.
This hook handled:
REST Assured specification setup
Authentication (login API call)
Token storage
Auth vs non-auth scenario handling
Report initialization
Context cleanup after every scenario
This single decision resulted in:
Discipline across the team
Predictable execution flow
A clean and maintainable framework
Key insight: A single global hook is not a limitation—it is a leadership and design decision.
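In a real project this logic lives in a Cucumber `Hooks.java` with `@Before`/`@After` methods. The flow can be sketched dependency-free like this (class and method names are illustrative, and the login call is stubbed out):

```java
import java.util.Set;

// Simplified sketch of a single global hook: one place that decides, per
// scenario, whether to log in, and that always cleans up afterwards.
// In the real framework these methods carry Cucumber @Before/@After annotations.
public class GlobalHook {
    private final ScenarioState state = new ScenarioState();

    // Runs before every scenario; Cucumber would supply the scenario's tags.
    public void beforeScenario(Set<String> tags) {
        if (tags.contains("@Auth")) {
            // In the real hook this is a REST Assured call to the login API.
            String token = fetchTokenFromLoginApi();
            state.put("token", token);
        }
    }

    // Runs after every scenario: state is wiped so scenarios stay isolated.
    public void afterScenario() {
        state.clear();
    }

    public String currentToken() {
        return (String) state.get("token");
    }

    private String fetchTokenFromLoginApi() {
        return "dummy-token"; // placeholder for the real login call
    }
}

// Minimal in-memory state holder used by the sketch above.
class ScenarioState {
    private final java.util.Map<String, Object> data = new java.util.HashMap<>();
    void put(String key, Object value) { data.put(key, value); }
    Object get(String key) { return data.get(key); }
    void clear() { data.clear(); }
}
```

Because every scenario passes through this one class, the login-then-test-then-cleanup order is guaranteed regardless of who wrote the scenario.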
Handling Auth & Non-Auth APIs the Smart Way
Not every API requires authentication, and hardcoding auth logic everywhere quickly becomes messy.
Instead, the framework used:
Scenario tags such as @Auth and @NoAuth
Conditional logic inside the hook
Conceptually:
@Auth → token is fetched
@NoAuth → authentication is skipped
This approach kept:
Step definitions clean
Scenarios readable
The framework flexible for future modules
Note: You can also use Gherkin's Rule keyword in feature files to group scenarios that require authentication separately from those that do not.
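A feature file combining tags with Rule blocks might look like this (module, scenario, and step names are illustrative, not the hackathon's actual scenarios):

```gherkin
Feature: User module

  Rule: Endpoints that require authentication
    @Auth
    Scenario: Create a user with a valid token
      When the user creation API is called with valid details
      Then the response status code should be 201

  Rule: Endpoints that work without authentication
    @NoAuth
    Scenario: Health check is publicly accessible
      When the health check API is called
      Then the response status code should be 200
```

The global hook reads the `@Auth`/`@NoAuth` tag and decides whether to fetch a token; the steps themselves never mention authentication.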
Why POJOs Are Used
In API automation, hardcoding JSON payloads inside test steps quickly makes tests difficult to read and maintain.
To avoid this, the framework uses POJOs (Plain Old Java Objects) to represent request bodies.
Using POJOs:
Keeps payload creation clean and structured
Avoids manual JSON construction
Allows REST Assured to handle serialization automatically
POJOs also enforce a clear separation between test behavior, API logic, and test data.
When APIs change, updates are limited to a single POJO class instead of multiple test files, making maintenance simpler and safer.
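A minimal POJO for a user-creation payload might look like this (the field names are assumptions for illustration, not the hackathon's actual API contract). When an instance is passed to REST Assured's `body(...)`, it is serialized to JSON automatically, provided a JSON mapper such as Jackson is on the classpath:

```java
// Illustrative request POJO. Getters/setters follow JavaBean conventions so
// the JSON mapper can serialize the fields without extra configuration.
public class UserRequest {
    private String firstName;
    private String lastName;
    private String email;

    public UserRequest() { }

    public UserRequest(String firstName, String lastName, String email) {
        this.firstName = firstName;
        this.lastName = lastName;
        this.email = email;
    }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}
```

In a step definition, this replaces a hand-built JSON string with something like `given().body(new UserRequest("Ada", "Lovelace", "ada@example.com")).post("/users")`.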
Sharing Data Using a Test Context
A common automation challenge is sharing tokens, IDs, request bodies, and responses across steps and modules without relying on static variables.
This was solved using a TestContext class.
The TestContext stored:
Access tokens
API responses
Request payloads
IDs created during execution
This enabled:
Clear separation of concerns
Easier debugging
Scalable and controlled data sharing
Key insight: A shared test context is the backbone of any serious automation framework.
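A minimal TestContext can be sketched as a thin wrapper over a map (method names here are illustrative). One instance is created per scenario, typically injected into step-definition classes via Cucumber's dependency injection (e.g. PicoContainer), so no static state is needed:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a shared, non-static test context: one instance per scenario,
// passed into every step-definition class that needs to share data.
public class TestContext {
    private final Map<String, Object> store = new HashMap<>();

    public void set(String key, Object value) {
        store.put(key, value);
    }

    // Typed retrieval avoids casts scattered through step definitions.
    public <T> T get(String key, Class<T> type) {
        return type.cast(store.get(key));
    }

    public boolean contains(String key) {
        return store.containsKey(key);
    }

    // Called from the global after-hook so scenarios never leak data.
    public void reset() {
        store.clear();
    }
}
```

A login step calls `context.set("token", token)`; a later step in any module reads it back with `context.get("token", String.class)`.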
API Object Repository (Service Layer)
Scattering REST calls across step definitions quickly leads to unreadable tests.
To avoid this, a Service Layer was introduced—similar to the Page Object Model, but for APIs.
Each module had its own service class:
LoginService
UserService
ProgramService
BatchService
This resulted in:
Clean and readable step definitions
Localized API changes
Immediate reusability across scenarios
At this point, the framework began to feel truly professional.
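The shape of a service class can be sketched like this. `ApiClient` is a hypothetical seam standing in for a REST Assured request specification; the real service methods would call `given().spec(...).post(...)` directly:

```java
// Sketch of the service-layer idea: each API module gets one class that owns
// its endpoints, so step definitions never build HTTP requests themselves.
@FunctionalInterface
interface ApiClient {
    String post(String path, Object body);
}

public class UserService {
    private static final String CREATE_USER = "/users"; // endpoint owned here
    private final ApiClient client;

    public UserService(ApiClient client) {
        this.client = client;
    }

    // Step definitions call this; if the endpoint path or payload shape
    // changes, only this class changes, not the scenarios.
    public String createUser(Object userPayload) {
        return client.post(CREATE_USER, userPayload);
    }
}
```

This mirrors the Page Object Model: scenarios describe intent ("create a user"), while the service class owns the mechanics of the call.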
Excel-Driven Testing (Done Right)
With a large number of scenarios, Excel-based test data made sense—but only with the right design.
The approach used:
One Excel file
Multiple sheets (one per module)
Dynamic column header mapping
The Excel reader returned:
List<Map<String, String>>
This allowed:
Column changes without breaking tests
Easy scalability of test data
Non-technical users to update inputs
Key insight: Excel still has value—when designed thoughtfully.
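The real reader walks an Apache POI Sheet, but the dynamic header-mapping idea can be shown dependency-free by treating a sheet as rows of strings, with the first row as the header row (class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Dependency-free sketch of dynamic header mapping: rows are keyed by the
// header names in row 0, so tests look up values by column name, not index.
public class SheetMapper {
    public static List<Map<String, String>> toRows(List<List<String>> sheet) {
        List<Map<String, String>> result = new ArrayList<>();
        if (sheet.isEmpty()) return result;

        List<String> headers = sheet.get(0); // header row drives the mapping
        for (int r = 1; r < sheet.size(); r++) {
            Map<String, String> row = new LinkedHashMap<>();
            List<String> cells = sheet.get(r);
            for (int c = 0; c < headers.size() && c < cells.size(); c++) {
                row.put(headers.get(c), cells.get(c));
            }
            result.add(row);
        }
        return result;
    }
}
```

Because a step asks for `row.get("username")` rather than column 3, inserting or reordering columns in the spreadsheet breaks nothing.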
CI-Ready from Day One
Even though CI/CD execution was not mandatory, the framework was designed assuming Jenkins execution from the start.
This ensured:
Maven-driven execution
No hardcoded values
Environment configurability
Clean and reliable reporting
The result:
Smooth Jenkins execution
Effortless environment switching
Key insight: Frameworks should always be built with CI in mind—because CI is inevitable.
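A minimal Jenkins pipeline for such a framework might look like this (stage names and the environment parameter are illustrative, and the Allure step assumes the Allure Jenkins plugin is installed):

```groovy
pipeline {
    agent any
    parameters {
        choice(name: 'ENV', choices: ['qa', 'staging'], description: 'Target environment')
    }
    stages {
        stage('Run API Tests') {
            steps {
                // Environment comes from a build parameter, never hardcoded in tests
                sh "mvn clean test -Denv=${params.ENV}"
            }
        }
    }
    post {
        always {
            // Publish Allure results after every run, pass or fail
            allure results: [[path: 'target/allure-results']]
        }
    }
}
```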
Team Collaboration & Scalability
The primary goal was not maximizing test count but enabling the team.
The framework focused on:
Allowing contributors to focus on scenarios
Hiding framework complexity
Enforcing conventions without friction
Because of this structure:
Parallel development worked without conflicts
Debugging was centralized
Onboarding new team members became easy
Key Takeaways
One global hook improves stability
Centralized authentication is essential
TestContext enables clean data sharing
Service layers improve maintainability
Excel scales when designed correctly
CI readiness should never be an afterthought
Final Thoughts
One lesson stands out clearly from this hackathon experience:
Framework design matters more than test count.
Thinking like a Test Architect—even as a beginner—helps teams:
Move faster
Collaborate better
Build automation that lasts beyond a single project
If you’re starting your first API automation framework: design first, code later.