The audit finding was a wake-up call. Our CI/CD pipeline logs inadvertently exposed a long-lived database service account password. While the immediate leak was contained, the post-mortem revealed a systemic problem: the proliferation of static, long-lived credentials across development, testing, and production environments. Every .env
file, every CI variable, every developer’s machine was a potential attack vector. The mandate was clear: eliminate static database credentials entirely. This wasn’t about rotating passwords more frequently; it was about making them ephemeral, just-in-time, and automatically managed for every component of our stack, from the Laravel backend API to the Playwright E2E test suite that validates our SwiftUI mobile application’s user flows.
Our initial concept was to build a credential lifecycle management system around HashiCorp Vault. The Laravel application, upon booting, wouldn’t read credentials from a file. Instead, it would authenticate to Vault using a machine-based identity (AppRole), request a new set of database credentials with a short Time-To-Live (TTL), and inject them into its configuration at runtime. Our Playwright test suite would do something similar, but with an added twist: it would not only fetch credentials for the test database but also dynamically create ephemeral test users for each test run, achieving true test isolation and eliminating shared, static test accounts. The SwiftUI client would remain agnostic, communicating with the now-secure Laravel API, its safety guaranteed by the security of the backend it consumes.
The technology selection was a pragmatic balancing act. Laravel provided a robust, mature framework for our API, but its standard configuration system is built around static .env
files, requiring a custom integration path. HashiCorp Vault was the obvious choice for its powerful dynamic secrets engine, specifically for databases. For E2E testing, Playwright offered the modern tooling we needed to simulate complex user interactions, though we had to hook into its test runner lifecycle to accommodate our pre-test secret-fetching routine. SwiftUI represented our native client, a critical piece of the puzzle that underscored the need for a secure backend, as client-side secret management is an anti-pattern we are determined to avoid.
Vault Configuration: The Bedrock of Dynamic Secrets
Before touching any application code, the foundation must be laid in Vault. This involves enabling the database secrets engine, configuring it to connect to our PostgreSQL database, and defining roles that Vault will use to generate dynamic credentials. A common mistake is to hand Vault a superuser account on the database. In a real-world project, you create a dedicated Vault user with just enough permissions to manage other users and roles.
First, enable the database secrets engine:
# Enable the database secrets engine at the path 'database'
vault secrets enable database
# Configure the connection to PostgreSQL
# Connection details are passed securely
vault write database/config/postgresql \
plugin_name=postgresql-database-plugin \
allowed_roles="api-dynamic-user,e2e-dynamic-user" \
connection_url="postgresql://{{username}}:{{password}}@postgres:5432/app_db?sslmode=disable" \
username="vault_admin" \
password="vault_admin_password"
Next, we define a role for our Laravel API. This role dictates how dynamic credentials are created. The creation_statements
are critical; they grant the newly created user the minimum required permissions. The TTL values enforce the ephemeral nature of these credentials.
# Define a role for the Laravel API
vault write database/roles/api-dynamic-user \
db_name=postgresql \
creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}'; \
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO \"{{name}}\";" \
default_ttl="1h" \
max_ttl="24h"
The pitfall here is overly permissive grant statements. Always adhere to the principle of least privilege. The user created for a single API instance should not have permissions to, for example, alter table schemas.
Finally, we establish authentication methods for our applications. We’ll use AppRole, which is designed for machine-to-machine authentication. One AppRole for the Laravel API, another for the CI/CD environment running Playwright.
# Enable AppRole auth method
vault auth enable approle
# Create a policy for the Laravel API
vault policy write laravel-api-policy - <<EOF
path "database/creds/api-dynamic-user" {
capabilities = ["read"]
}
EOF
# Create the AppRole for the Laravel API
vault write auth/approle/role/laravel-api \
secret_id_ttl="10m" \
token_num_uses="0" \
token_ttl="20m" \
token_max_ttl="30m" \
policies="laravel-api-policy"
# Fetch the RoleID and SecretID (to be injected into the app environment)
# In production, these would be delivered via a secure mechanism like k8s secrets
# vault read auth/approle/role/laravel-api/role-id
# vault write -f auth/approle/role/laravel-api/secret-id
The same process is repeated to create a distinct e2e-runner-policy
and e2e-runner
AppRole for the Playwright CI job, ensuring separation of concerns.
Laravel Integration: Dynamic Configuration at Boot Time
Laravel’s service container provides the perfect entry point to override the default database configuration. We’ll create a VaultServiceProvider
that executes during the application’s boot sequence. Its sole responsibility is to connect to Vault, fetch dynamic credentials, and overwrite the in-memory database configuration before any database connection is ever made.
This requires a robust client implementation that can handle authentication, credential fetching, and importantly, logging. Do not silently swallow errors during this critical startup phase.
Here is the core VaultServiceProvider.php:
<?php
namespace App\Providers;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\ServiceProvider;
use RuntimeException;
class VaultServiceProvider extends ServiceProvider
{
/**
* Bootstrap any application services.
*
* @return void
*/
public function boot(): void
{
// Only run this logic if Vault is explicitly enabled and configured.
if (!config('vault.enabled')) {
return;
}
try {
$credentials = $this->fetchDatabaseCredentials();
// Overwrite the default database configuration in memory.
// This happens before any part of the app establishes a DB connection.
Config::set('database.connections.pgsql.username', $credentials['username']);
Config::set('database.connections.pgsql.password', $credentials['password']);
// It's also critical to log the lease duration for monitoring and debugging.
Log::info('Successfully fetched dynamic DB credentials from Vault.', [
'lease_id' => $credentials['lease_id'],
'lease_duration' => $credentials['lease_duration'],
]);
} catch (\Exception $e) {
// If we cannot get credentials, the application MUST fail to start.
// Continuing with invalid credentials would lead to cascading failures.
Log::critical('Failed to fetch database credentials from Vault. Application cannot start.', [
'error' => $e->getMessage(),
]);
// This will prevent the application from serving requests.
throw new RuntimeException('Could not bootstrap application: Vault secret acquisition failed.', 0, $e);
}
}
/**
* Fetches dynamic database credentials from HashiCorp Vault.
*
* @return array
*/
protected function fetchDatabaseCredentials(): array
{
$vaultAddr = config('vault.addr');
$roleId = config('vault.role_id');
$secretId = config('vault.secret_id');
$dbRole = config('vault.database_role');
// Step 1: Authenticate with Vault using AppRole to get a client token.
$loginResponse = Http::post("{$vaultAddr}/v1/auth/approle/login", [
'role_id' => $roleId,
'secret_id' => $secretId,
]);
if (!$loginResponse->successful()) {
throw new RuntimeException('Vault AppRole login failed: ' . $loginResponse->body());
}
$clientToken = $loginResponse->json('auth.client_token');
// Step 2: Use the client token to request dynamic database credentials.
$credsResponse = Http::withToken($clientToken)
->get("{$vaultAddr}/v1/database/creds/{$dbRole}");
if (!$credsResponse->successful()) {
throw new RuntimeException('Failed to fetch dynamic DB credentials from Vault: ' . $credsResponse->body());
}
$data = $credsResponse->json();
return [
'username' => $data['data']['username'],
'password' => $data['data']['password'],
'lease_id' => $data['lease_id'],
'lease_duration' => $data['lease_duration'],
];
}
}
This provider is then registered in config/app.php
. Configuration for Vault itself is placed in a config/vault.php
file, which reads from environment variables (VAULT_ADDR
, VAULT_ROLE_ID
, VAULT_SECRET_ID
). In a production environment using Kubernetes, these variables would be securely mounted into the pod.
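For completeness, here is a minimal sketch of what that config/vault.php file could contain. The keys mirror exactly what the provider reads; the VAULT_ENABLED and VAULT_DATABASE_ROLE variable names are illustrative defaults rather than anything mandated by Vault or Laravel.
<?php

// config/vault.php
return [
    // Master switch checked by VaultServiceProvider::boot().
    'enabled' => env('VAULT_ENABLED', false),

    // Address of the Vault server, e.g. http://vault:8200.
    'addr' => env('VAULT_ADDR', 'http://127.0.0.1:8200'),

    // AppRole credentials, delivered by the orchestrator (e.g. a Kubernetes secret).
    'role_id' => env('VAULT_ROLE_ID'),
    'secret_id' => env('VAULT_SECRET_ID'),

    // Vault database role to request credentials for.
    'database_role' => env('VAULT_DATABASE_ROLE', 'api-dynamic-user'),
];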
A crucial aspect often overlooked is lease management. The credentials fetched have a TTL. While this example focuses on fetching at boot, a production-ready implementation would also include a background process or a scheduled command to renew the lease before it expires, preventing the application from losing database access mid-operation.
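As a sketch of that renewal path, a scheduled artisan command can call Vault's sys/leases/renew endpoint. This assumes the service provider stashes the client token and lease ID somewhere the command can reach them; the vault.token and vault.lease_id cache keys below are hypothetical, and the Vault token itself also has a TTL that may require a fresh AppRole login.
<?php

// app/Console/Commands/RenewVaultLease.php
// Sketch only: assumes VaultServiceProvider caches the Vault client token and
// the credential lease ID under the hypothetical keys 'vault.token' / 'vault.lease_id'.

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Log;

class RenewVaultLease extends Command
{
    protected $signature = 'vault:renew-lease';

    protected $description = 'Renew the dynamic database credential lease before it expires';

    public function handle(): int
    {
        $vaultAddr = config('vault.addr');
        $token = Cache::get('vault.token');      // cached at boot (assumed)
        $leaseId = Cache::get('vault.lease_id'); // cached at boot (assumed)

        if (!$token || !$leaseId) {
            $this->error('No Vault token or lease ID available; nothing to renew.');
            return self::FAILURE;
        }

        // Vault's generic lease renewal endpoint: PUT /v1/sys/leases/renew
        $response = Http::withToken($token)->put("{$vaultAddr}/v1/sys/leases/renew", [
            'lease_id'  => $leaseId,
            'increment' => 3600, // request another hour; Vault caps this at max_ttl
        ]);

        if (!$response->successful()) {
            Log::error('Vault lease renewal failed.', ['body' => $response->body()]);
            return self::FAILURE;
        }

        Log::info('Vault lease renewed.', [
            'lease_duration' => $response->json('lease_duration'),
        ]);

        return self::SUCCESS;
    }
}
Scheduled with something like $schedule->command('vault:renew-lease')->everyFifteenMinutes(); in the console kernel, renewals land comfortably inside the one-hour default TTL configured earlier.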
Playwright E2E Test Suite: Dynamic Test Environments
Integrating dynamic secrets into the E2E test suite is where the real power of this pattern becomes apparent. We eliminate static test accounts like test@example.com and hardcoded database states. Each CI run becomes a clean, isolated execution.
Playwright’s globalSetup
feature is the perfect tool for this. It’s a script that runs once before all tests. We’ll use it to interact with Vault and prepare the entire test environment.
Here is a conceptual playwright.config.ts
and the global-setup.ts
script:
// playwright.config.ts
import { defineConfig, devices } from '@playwright/test';
export default defineConfig({
testDir: './tests/e2e',
// Run global setup before all tests
globalSetup: require.resolve('./tests/e2e/global-setup'),
// Use a shared server configuration for all tests
webServer: {
command: 'php artisan serve --env=testing',
url: 'http://127.0.0.1:8000',
reuseExistingServer: !process.env.CI,
env: {
// We will populate .env.testing from our global-setup
'APP_ENV': 'testing'
}
},
use: {
baseURL: 'http://127.0.0.1:8000',
trace: 'on-first-retry',
},
projects: [
{
name: 'chromium',
use: { ...devices['Desktop Chrome'] },
},
],
});
The magic happens in global-setup.ts
. This script will use the CI runner’s AppRole credentials to talk to Vault. It performs two key actions:
- Fetches dynamic database credentials for the Laravel test server and writes them to a .env.testing file.
- Uses a separate Vault role (or an API endpoint in Laravel) to create an ephemeral test user for the E2E tests and saves the user’s credentials to a temporary file (test-user.json) for the tests to read.
// tests/e2e/global-setup.ts
import { FullConfig } from '@playwright/test';
import { request as httpRequest } from 'undici';
import * as fs from 'fs/promises';
import * as path from 'path';
const VAULT_ADDR = process.env.VAULT_ADDR;
const VAULT_ROLE_ID = process.env.VAULT_CI_ROLE_ID;
const VAULT_SECRET_ID = process.env.VAULT_CI_SECRET_ID;
// A dedicated Vault role for E2E database credentials
const VAULT_DB_ROLE = 'e2e-dynamic-user';
// An API endpoint to create an ephemeral test user
const LARAVEL_API_CREATE_USER_URL = 'http://127.0.0.1:8000/api/testing/create-ephemeral-user';
async function globalSetup(config: FullConfig) {
console.log('Global setup: Fetching dynamic credentials from Vault...');
if (!VAULT_ADDR || !VAULT_ROLE_ID || !VAULT_SECRET_ID) {
throw new Error('Vault CI environment variables are not set.');
}
// Step 1: Authenticate to Vault
const loginRes = await httpRequest(`${VAULT_ADDR}/v1/auth/approle/login`, {
method: 'POST',
body: JSON.stringify({
role_id: VAULT_ROLE_ID,
secret_id: VAULT_SECRET_ID,
}),
});
const loginBody = await loginRes.body.json();
const clientToken = loginBody?.auth?.client_token;
if (loginRes.statusCode !== 200 || !clientToken) {
throw new Error(`Vault AppRole login failed (HTTP ${loginRes.statusCode}).`);
}
// Step 2: Fetch dynamic database credentials for the Laravel test server
const dbCredsRes = await httpRequest(`${VAULT_ADDR}/v1/database/creds/${VAULT_DB_ROLE}`, {
method: 'GET',
headers: { 'X-Vault-Token': clientToken },
});
const dbCredsBody = await dbCredsRes.body.json();
const dbCreds = dbCredsBody.data;
if (!dbCreds || !dbCreds.username || !dbCreds.password) {
throw new Error('Failed to fetch dynamic DB credentials from Vault.');
}
// Step 3: Write credentials to a .env.testing file for the webServer process to use
const envContent = `
DB_CONNECTION=pgsql
DB_HOST=postgres
DB_PORT=5432
DB_DATABASE=app_db
DB_USERNAME=${dbCreds.username}
DB_PASSWORD=${dbCreds.password}
`;
await fs.writeFile('.env.testing', envContent);
console.log('Successfully wrote dynamic DB credentials to .env.testing');
// Step 4: Use a backend endpoint to create an ephemeral application user for tests.
// This is better than having Vault manage application users directly.
// The endpoint itself is protected and should only be available in the testing environment.
// The backend, now using its own dynamic DB credentials, will create this user.
// NOTE: This part requires the Laravel server to be running.
// In a real CI pipeline, you might start the server, call this, then proceed with tests.
// For simplicity here, we assume a mechanism exists to create this user.
// A more robust approach might be a CLI command `php artisan test:create-user`.
const testUser = {
email: `test-user-${Date.now()}@example.com`,
password: `P@ssw0rd!${Math.random()}`
};
// In a real scenario, you'd call the API here to create the user.
// For this example, we'll just write the generated credentials.
await fs.writeFile(path.join(__dirname, 'test-user.json'), JSON.stringify(testUser));
console.log(`Ephemeral test user credentials saved for user: ${testUser.email}`);
}
export default globalSetup;
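The ephemeral-user step above is deliberately stubbed. One way to back it, matching the LARAVEL_API_CREATE_USER_URL constant, is a testing-only route on the Laravel side. The sketch below is hypothetical and must never be registered outside the testing environment:
<?php

// routes/api.php (excerpt)
// Hypothetical testing-only endpoint backing LARAVEL_API_CREATE_USER_URL in global-setup.ts.

use App\Models\User;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Hash;
use Illuminate\Support\Facades\Route;

// Only register the route when the app runs in the testing environment.
if (app()->environment('testing')) {
    Route::post('/testing/create-ephemeral-user', function (Request $request) {
        $validated = $request->validate([
            'email' => ['required', 'email'],
            'password' => ['required', 'string', 'min:8'],
        ]);

        // The API's own connection already uses Vault-issued dynamic credentials,
        // so this insert runs under an ephemeral database user as well.
        $user = User::create([
            'name' => 'E2E Test User',
            'email' => $validated['email'],
            'password' => Hash::make($validated['password']),
        ]);

        return response()->json(['id' => $user->id, 'email' => $user->email], 201);
    });
}
With this in place, global-setup.ts would POST the generated testUser to the endpoint (or shell out to an equivalent php artisan test:create-user command) before writing test-user.json.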
Now, a sample test can read the ephemeral user’s details and proceed with the test run.
// tests/e2e/auth.spec.ts
import { test, expect } from '@playwright/test';
import * as fs from 'fs';
import * as path from 'path';
// Read the dynamically created user credentials
const testUser = JSON.parse(fs.readFileSync(path.join(__dirname, 'test-user.json'), 'utf-8'));
test('user can log in and access a protected route', async ({ page }) => {
await page.goto('/login');
await page.getByLabel('Email').fill(testUser.email);
await page.getByLabel('Password').fill(testUser.password);
await page.getByRole('button', { name: 'Log In' }).click();
await expect(page).toHaveURL('/dashboard');
await expect(page.getByText(`Welcome, ${testUser.email}`)).toBeVisible();
});
This setup completely decouples the test suite from static user data and database states.
SwiftUI Client Architectural Considerations
The SwiftUI application remains blissfully unaware of Vault. Its contract is with the Laravel API. This separation is a core security principle: a mobile client should never have credentials to infrastructure components like a secrets manager.
The client’s security responsibility is to handle user authentication against the API securely. This typically involves using a robust authentication mechanism like OAuth 2.0 (managed by Laravel Passport) or token-based authentication (Laravel Sanctum). Secure storage of authentication tokens on the device (e.g., using the iOS Keychain) is paramount.
Here’s a conceptual network service in Swift demonstrating how it would interact with the API, without any knowledge of the backend’s dynamic secret infrastructure.
import Foundation
// A simplified networking service to interact with the Laravel API
class ApiService {
static let shared = ApiService()
private let baseURL = URL(string: "http://127.0.0.1:8000/api")!
// In a real app, this token would be securely stored in the Keychain
private var authToken: String?
enum ApiError: Error {
case invalidURL
case requestFailed(Error)
case decodingError
case unauthorized
case serverError(statusCode: Int)
}
func login(credentials: LoginCredentials, completion: @escaping (Result<Bool, ApiError>) -> Void) {
let endpoint = baseURL.appendingPathComponent("login")
var request = URLRequest(url: endpoint)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
// LoginCredentials conforms to Codable, so it can be encoded directly
let body = try? JSONEncoder().encode(credentials)
request.httpBody = body
URLSession.shared.dataTask(with: request) { data, response, error in
// ... handle response, extract token, store in Keychain ...
// On success, set self.authToken and call completion(Result.success(true))
}.resume()
}
func fetchProtectedData(completion: @escaping (Result<ProtectedData, ApiError>) -> Void) {
guard let token = self.authToken else {
completion(.failure(.unauthorized))
return
}
let endpoint = baseURL.appendingPathComponent("user/profile")
var request = URLRequest(url: endpoint)
request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
URLSession.shared.dataTask(with: request) { data, response, error in
// ... handle response, decode JSON, call completion ...
}.resume()
}
}
// Example data structures
struct LoginCredentials: Codable { /* ... */ }
struct ProtectedData: Codable { /* ... */ }
The key takeaway is that by securing the backend with Vault, we inherently protect the mobile client that depends on it. The integrity of the data and business logic served to the SwiftUI app is now backed by a system that doesn’t rely on fragile, long-lived secrets.
graph TD
  subgraph "CI Environment (Playwright Runner)"
    A[global-setup.ts] -- 1. Auth with AppRole --> V[HashiCorp Vault];
    V -- 2. Issues DB creds --> A;
    A -- 3. Writes .env.testing --> L_Test[Laravel Test Server];
    A -- 4. Creates Test User via API --> L_Test;
    PT[Playwright Tests] -- 5. Runs against --> L_Test;
  end
  subgraph "Application Environment"
    S[SwiftUI Client] -- 6. API Calls --> L_Prod[Laravel API];
    L_Prod -- 7. On Boot, Auth with AppRole --> V;
    V -- 8. Issues DB creds --> L_Prod;
    L_Prod -- 9. Connects to --> DB[(PostgreSQL)];
  end
  subgraph "Secrets & Data"
    V -- Manages Roles/Leases --> DB;
  end
  style V fill:#f9f,stroke:#333,stroke-width:2px
  style DB fill:#cff,stroke:#333,stroke-width:2px
This architecture is certainly more complex than a simple .env
file approach. The operational overhead of maintaining a Vault cluster is non-trivial, and it introduces a new, critical dependency. If Vault is down, no service can start, and no test can run. This trade-off of complexity for security is not for every project. For applications handling sensitive data or operating in regulated environments, however, it shifts from a “nice-to-have” to a fundamental requirement.
A remaining challenge is the robust handling of lease renewals within a long-running Laravel process like a queue worker. A naive implementation could fail if a lease expires between renewal checks. A more advanced solution might involve a dedicated renewal sidecar or a more sophisticated client library that handles renewals transparently in the background. Furthermore, this model could be extended beyond database credentials to manage API keys for third-party services, further reducing the static secret footprint of the entire application.