The conventional approach of storing JWTs or API keys in localStorage
for Single-Page Applications is a well-documented XSS vulnerability. Any third-party script compromise can lead to token exfiltration. While short-lived tokens and refresh mechanisms mitigate this, the core problem remains: the application’s main thread has direct access to sensitive credentials. Our objective was to completely isolate credentials from the main browser context, manage their lifecycle dynamically, and provide clear user feedback on the security state, all without introducing significant backend complexity.
The initial pain point was a large-scale internal dashboard that needed to query multiple databases directly for real-time reporting. Storing long-lived database credentials on the server backing the dashboard was a non-starter, and passing them to the client was unthinkable. The requirement was for ephemeral, per-session, per-user database credentials. This led us to HashiCorp Vault’s Database Secrets Engine, but the question of how to deliver these short-lived credentials securely to the client persisted.
Our first concept involved a backend-for-frontend (BFF) that would proxy all data requests, attaching the credentials on the server-side. This is a robust pattern, but it introduces another stateful service to maintain, adds a network hop, and complicates our deployment topology. We sought a solution that could shift this proxying logic to the client-side but within a secure, isolated boundary. This is where the Service Worker became the cornerstone of our architecture. A Service Worker acts as a network proxy in the browser, but crucially, it runs in a separate thread with a distinct global scope, inaccessible to the DOM. It can intercept network requests, modify them, and manage its own state, making it an ideal candidate for a client-side “credential vault.”
Technology Selection Rationale
In a real-world project, technology choices are driven by constraints and requirements, not just novelty.
HashiCorp Vault: This was a foundational requirement. Its ability to generate dynamic, time-bound database credentials on-demand is precisely what we needed. Using its AppRole authentication method allows our backend API to securely authenticate itself to Vault without hardcoded tokens.
Node.js/Express RESTful API: We needed a lightweight, stateless intermediary to bridge the Service Worker and Vault. A simple Node.js application is perfect for this. Its sole responsibilities are to authenticate to Vault, request a new database secret, and pass it back to the Service Worker. It never needs to talk to the database itself, minimizing its attack surface.
Service Worker: The core of the client-side security model. It intercepts outgoing fetch requests from the main application, holds the ephemeral database credentials in its own memory scope, injects them as headers, and manages the renewal process just before the credential's lease expires. This achieves total isolation of the credential from the React application code.
Styled-components: A security mechanism that is invisible to the user is often a source of confusion. We needed a way to reflect the current security context in the UI. Is the connection secure? Is the credential about to expire? Has the connection been lost? Styled-components allows us to create dynamic, prop-driven UI components that can change their appearance based on status messages received from the Service Worker, providing essential, non-intrusive feedback.
The Architectural Flow
The entire process operates as a chain of secure handoffs.
sequenceDiagram
    participant ReactApp as React App (Main Thread)
    participant SW as Service Worker
    participant API as Node.js API
    participant Vault
    participant DB as Target Database
    ReactApp->>SW: fetch('/api/protected-data')
    Note over SW: Intercepts fetch request
    SW->>SW: Check for cached credential
    alt Credential missing or expired
        SW->>API: POST /api/vault/db-creds
        API->>Vault: Authenticate (AppRole) & Request Secret
        Vault-->>API: Dynamic DB Username/Password (Lease TTL: 300s)
        API-->>SW: Returns credential & lease duration
        Note over SW: Caches credential in its own scope
    end
    Note over SW: Clones original request
    SW->>SW: Injects 'Authorization' header with DB creds
    SW->>DB: Forward modified request
    DB-->>SW: Returns protected data
    SW-->>ReactApp: Responds with data
Implementation: Vault and the Backend API
First, we configure Vault. This assumes Vault is already running. We enable the database secrets engine and configure it for a PostgreSQL database.
Vault Configuration (CLI):
# Enable the database secrets engine
vault secrets enable database
# Configure the database connection.
# In production, these values would come from a secure source.
vault write database/config/my-postgres \
plugin_name=postgresql-database-plugin \
allowed_roles="readonly-role" \
connection_url="postgresql://{{username}}:{{password}}@postgres:5432/dashboard?sslmode=disable" \
username="postgres_admin" \
password="admin_password"
# Create a role that defines how credentials are generated.
# This SQL creates a temporary role in Postgres for each lease.
vault write database/roles/readonly-role \
db_name=my-postgres \
creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}'; \
GRANT SELECT ON ALL TABLES IN SCHEMA public TO \"{{name}}\";" \
default_ttl="5m" \
max_ttl="15m"
# Set up AppRole for the backend API to authenticate
vault auth enable approle
vault write auth/approle/role/api-role \
secret_id_ttl=10m \
token_num_uses=10 \
token_ttl=20m \
token_max_ttl=30m \
secret_id_num_uses=40
# Get the RoleID and a SecretID for the Node.js API
vault read auth/approle/role/api-role/role-id
# OUTPUT: role_id: 1234...
vault write -f auth/approle/role/api-role/secret-id
# OUTPUT: secret_id: 5678...
# These values will be set as environment variables for the Node API.
Next, the Node.js API that the Service Worker will call. This is a minimal Express server.
server.js
const express = require('express');
const vault = require('node-vault');
const morgan = require('morgan');
const app = express();
const port = 3001;
// Basic configuration - pull from environment variables in production
const VAULT_ADDR = process.env.VAULT_ADDR || 'http://127.0.0.1:8200';
const VAULT_ROLE_ID = process.env.VAULT_ROLE_ID;
const VAULT_SECRET_ID = process.env.VAULT_SECRET_ID;
if (!VAULT_ROLE_ID || !VAULT_SECRET_ID) {
console.error("FATAL: VAULT_ROLE_ID and VAULT_SECRET_ID must be set.");
process.exit(1);
}
const vaultOptions = {
apiVersion: 'v1',
endpoint: VAULT_ADDR,
};
const vaultClient = vault(vaultOptions);
// Centralized logging for requests
app.use(morgan('combined'));
app.use(express.json());
// A simple in-memory cache for the Vault token to avoid re-authenticating on every call
let vaultTokenCache = {
token: null,
expires: 0,
};
async function getVaultToken() {
if (vaultTokenCache.token && Date.now() < vaultTokenCache.expires) {
return vaultTokenCache.token;
}
console.log('Vault token expired or not present. Re-authenticating with AppRole.');
try {
const result = await vaultClient.approleLogin({
role_id: VAULT_ROLE_ID,
secret_id: VAULT_SECRET_ID,
});
const token = result.auth.client_token;
const ttl = result.auth.lease_duration; // in seconds
vaultTokenCache = {
token: token,
expires: Date.now() + (ttl * 1000 * 0.9), // Renew at 90% of TTL
};
vaultClient.token = token; // Set the token for subsequent requests
return token;
} catch (err) {
console.error('AppRole authentication failed:', err);
throw new Error('Could not authenticate with Vault.');
}
}
app.post('/api/vault/db-creds', async (req, res) => {
try {
await getVaultToken();
const dbCreds = await vaultClient.read('database/creds/readonly-role');
// A crucial step: we are not just sending the credentials, but also
// the lease duration. The Service Worker needs this to manage the lifecycle.
res.json({
username: dbCreds.data.username,
password: dbCreds.data.password,
lease_duration: dbCreds.lease_duration, // in seconds
});
} catch (error) {
console.error('Failed to fetch DB credentials from Vault:', error.message);
// Do not leak internal error details to the client
res.status(500).json({ error: 'Internal Server Error' });
}
});
// Mock protected data endpoint for demonstration purposes.
// In a real scenario, this would be the actual database service.
app.get('/api/protected-data', (req, res) => {
// In a real system, this endpoint would validate the provided database credentials
// before returning data. Here we just simulate a successful response.
const authHeader = req.headers['authorization'];
if (!authHeader) {
return res.status(401).json({ error: 'Authorization header missing' });
}
console.log('Received request with Authorization header.');
res.json({
data: `Sensitive data accessed at ${new Date().toISOString()}`
});
});
app.listen(port, () => {
console.log(`Vault-backed API listening on port ${port}`);
});
A common mistake here is to perform the approleLogin on every single request. This is inefficient and can hit API rate limits on Vault. The simple in-memory cache for the Vault client token within the Node.js API is a pragmatic optimization.
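One gap the cache alone does not close: a burst of concurrent requests arriving while the cached token is expired can still trigger several parallel approleLogin calls before the first one populates the cache. A small single-flight wrapper collapses those into one in-flight authentication. This is a sketch, not part of the server above; loginFn stands in for the real AppRole login call.

```javascript
// Collapse concurrent authentication attempts into a single in-flight call.
// `loginFn` is a placeholder for the actual AppRole login; it must return a promise.
function singleFlight(loginFn) {
  let inFlight = null;
  return function () {
    if (!inFlight) {
      inFlight = loginFn().finally(() => {
        // Reset so the next cache miss can trigger a fresh login.
        inFlight = null;
      });
    }
    return inFlight;
  };
}

// Usage sketch: wrap the expensive login once at startup, then call it from
// getVaultToken() on a cache miss. Ten concurrent callers share one round-trip:
// const login = singleFlight(() => vaultClient.approleLogin({ role_id, secret_id }));
// await Promise.all(Array.from({ length: 10 }, () => login()));
```

This complements the TTL cache rather than replacing it: the cache decides *when* to log in again, the wrapper guarantees *how many* logins happen at once.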
Implementation: The Service Worker Proxy
This is the most critical piece. The Service Worker must be robust. It needs to handle the install, activate, and fetch events.
public/service-worker.js
// A simple in-memory store for our dynamic credential.
// This is the key: this variable is INACCESSIBLE from the main document context.
let dbCredential = null;
let credentialLeaseTimeout = null;
// The Service Worker installation
self.addEventListener('install', event => {
console.log('Service Worker installing.');
// Force the waiting service worker to become the active service worker.
self.skipWaiting();
});
self.addEventListener('activate', event => {
console.log('Service Worker activating.');
// Take control of all clients as soon as activated.
event.waitUntil(self.clients.claim());
});
// Intercept fetch requests
self.addEventListener('fetch', event => {
const url = new URL(event.request.url);
// Only intercept requests to our protected data API
if (url.pathname.startsWith('/api/protected-data')) {
// respondWith() takes a promise. We create one that resolves with the final response.
event.respondWith(handleProtectedRequest(event.request));
}
});
async function handleProtectedRequest(request) {
try {
// Ensure we have a valid credential before proceeding.
const credential = await getValidCredential();
// Clone the request to modify it. Requests are streams and can only be consumed once.
const newRequest = new Request(request.url, {
method: request.method,
headers: request.headers,
body: request.body,
mode: 'same-origin', // restrict the proxied request to our own origin
credentials: request.credentials,
redirect: request.redirect,
});
// The core logic: Inject the credential into the Authorization header.
// We use Basic Auth format for this example.
const basicAuth = btoa(`${credential.username}:${credential.password}`);
newRequest.headers.set('Authorization', `Basic ${basicAuth}`);
// Send the modified request to the network.
return await fetch(newRequest);
} catch (error) {
console.error('Service Worker: Failed to handle protected request:', error);
// Notify the client that the security context is broken
notifyClients('UNSECURED');
// Return an error response to the original fetch call in the app
return new Response(JSON.stringify({ error: 'Failed to apply credentials' }), {
status: 500,
headers: { 'Content-Type': 'application/json' },
});
}
}
async function getValidCredential() {
// If we have a cached credential, return it. The renewal timer set in
// fetchAndCacheCredential nulls it out before the lease expires, so presence implies validity.
if (dbCredential) {
return dbCredential;
}
console.log('Service Worker: No valid credential. Fetching a new one.');
return await fetchAndCacheCredential();
}
async function fetchAndCacheCredential() {
try {
const response = await fetch('/api/vault/db-creds', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
}
});
if (!response.ok) {
throw new Error(`Server responded with status: ${response.status}`);
}
const data = await response.json();
dbCredential = {
username: data.username,
password: data.password,
};
// Clear any previous renewal timer
if (credentialLeaseTimeout) {
clearTimeout(credentialLeaseTimeout);
}
// The lease_duration from our API is critical for proactive renewal.
// We'll renew at 80% of the lease time to avoid race conditions.
const renewalTime = data.lease_duration * 1000 * 0.8;
console.log(`Service Worker: Credential obtained. Will renew in ${renewalTime / 1000} seconds.`);
credentialLeaseTimeout = setTimeout(() => {
console.log('Service Worker: Credential lease expiring. Proactively renewing.');
dbCredential = null; // Invalidate the current credential
credentialLeaseTimeout = null;
// The next call to getValidCredential() will trigger a new fetch.
notifyClients('EXPIRING');
}, renewalTime);
// Notify clients that the connection is now secure.
notifyClients('SECURE');
return dbCredential;
} catch (error) {
console.error('Service Worker: Failed to fetch new credential:', error);
dbCredential = null; // Ensure we don't use stale data
if (credentialLeaseTimeout) clearTimeout(credentialLeaseTimeout);
throw error; // Propagate error to the fetch handler
}
}
// Helper function to send status messages to all client pages.
async function notifyClients(status) {
const allClients = await self.clients.matchAll({
includeUncontrolled: true,
type: 'window',
});
for (const client of allClients) {
client.postMessage({ type: 'SECURITY_STATUS', status });
}
}
The pitfall here is lifecycle management. Simply fetching a token isn't enough. The proactive renewal using setTimeout based on the lease duration provided by the API is what makes this a robust, production-ready solution. If the worker just waited for a request to fail before renewing, the user would experience intermittent errors.
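A variation worth considering: instead of nulling the credential and letting the next request pay the refresh latency, renew in the background at 80% of each lease so a valid credential is always on hand. The sketch below is a generic scheduler under that assumption; renew is a placeholder for a fetchAndCacheCredential-style function that resolves to the fresh lease duration in seconds.

```javascript
// Renew-ahead scheduler: fires `renew` at `factor` of each lease duration and
// re-arms itself from the lease returned by every successful renewal.
function scheduleRenewal(initialLeaseSec, renew, factor = 0.8) {
  let timer = null;
  let cancelled = false;
  const arm = (delayMs) => {
    timer = setTimeout(async () => {
      try {
        const nextLeaseSec = await renew();
        if (!cancelled) arm(nextLeaseSec * 1000 * factor);
      } catch (err) {
        // The old credential is still valid for the tail of its lease,
        // so retry shortly instead of failing hard on one bad renewal.
        console.error('Proactive renewal failed, retrying shortly:', err);
        if (!cancelled) arm(1000);
      }
    }, delayMs);
  };
  arm(initialLeaseSec * 1000 * factor);
  return { cancel: () => { cancelled = true; clearTimeout(timer); } };
}
```

With a scheduler like this, the Service Worker could swap the credential in place and reserve the EXPIRING status for renewals that actually fail, at the cost of background traffic to Vault even when the dashboard is idle.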
Implementation: Frontend Integration with React and Styled-components
Finally, we tie it all together in the React application.
src/index.js
(Registering the Service Worker)
import React from 'react';
import ReactDOM from 'react-dom/client';
import App from './App';
// Register the Service Worker
if ('serviceWorker' in navigator) {
window.addEventListener('load', () => {
navigator.serviceWorker.register('/service-worker.js')
.then(registration => {
console.log('ServiceWorker registration successful with scope: ', registration.scope);
})
.catch(error => {
console.error('ServiceWorker registration failed: ', error);
});
});
}
const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(
<React.StrictMode>
<App />
</React.StrictMode>
);
src/App.js
(The main application component)
import React, { useState, useEffect } from 'react';
import styled from 'styled-components';
// This is our dynamic UI indicator.
const SecurityStatusIndicator = styled.div`
padding: 10px 15px;
margin-bottom: 20px;
border-radius: 4px;
font-weight: bold;
text-align: center;
transition: background-color 0.3s ease, color 0.3s ease;
// Use props to dynamically change styles
background-color: ${({ status }) => {
switch (status) {
case 'SECURE':
return '#28a745'; // Green
case 'EXPIRING':
return '#ffc107'; // Yellow
case 'UNSECURED':
return '#dc3545'; // Red
default:
return '#6c757d'; // Gray
}
}};
color: ${({ status }) => (status === 'EXPIRING' ? '#212529' : '#fff')};
`;
function App() {
const [data, setData] = useState(null);
const [error, setError] = useState('');
const [securityStatus, setSecurityStatus] = useState('INITIALIZING');
// Effect to listen for messages from the Service Worker
useEffect(() => {
const handleMessage = (event) => {
if (event.data && event.data.type === 'SECURITY_STATUS') {
console.log(`Received status from SW: ${event.data.status}`);
setSecurityStatus(event.data.status);
}
};
navigator.serviceWorker.addEventListener('message', handleMessage);
// Cleanup
return () => {
navigator.serviceWorker.removeEventListener('message', handleMessage);
};
}, []);
const fetchData = async () => {
setError('');
setData(null);
try {
// The application code is clean. It has no knowledge of tokens or credentials.
// It just makes a simple fetch call. The SW handles the rest.
const response = await fetch('/api/protected-data');
if (!response.ok) {
// The error could be from the DB or from the SW failing to inject creds.
// The SW logs will have the specific details.
const errData = await response.json();
throw new Error(errData.error || `HTTP error! status: ${response.status}`);
}
const result = await response.json();
setData(result.data);
} catch (e) {
console.error("Fetch failed:", e);
setError(e.message);
}
};
const getStatusText = (status) => {
switch (status) {
case 'SECURE':
return 'Connection Secure: Credentials are active and injected.';
case 'EXPIRING':
return 'Connection Warning: Credentials are being renewed.';
case 'UNSECURED':
return 'Connection Insecure: Could not obtain credentials.';
default:
return 'Connection Status: Initializing...';
}
}
return (
<div style={{ padding: '2rem', fontFamily: 'sans-serif' }}>
<h1>Client-Side Secret Injection Demo</h1>
<SecurityStatusIndicator status={securityStatus}>
{getStatusText(securityStatus)}
</SecurityStatusIndicator>
<button onClick={fetchData}>Fetch Protected Data</button>
{data && <pre style={{ background: '#f0f0f0', padding: '1rem', marginTop: '1rem' }}>{JSON.stringify({ data }, null, 2)}</pre>}
{error && <p style={{ color: 'red', marginTop: '1rem' }}>Error: {error}</p>}
</div>
);
}
export default App;
This completes the loop. The React application is blissfully unaware of the complex credential lifecycle management happening in the background. It simply makes a network request and receives data or an error. The SecurityStatusIndicator provides the necessary feedback loop to the user, driven by messages from the Service Worker that actually manages the state.
Limitations and Future Considerations
This architecture, while providing strong security isolation on the client-side, is not without its trade-offs. The state (the credential) stored in the Service Worker's global scope is ephemeral. If the browser terminates the Service Worker process due to inactivity, the credential is lost. The next fetch request will incur the latency of a full re-authentication cycle with the backend API and Vault. For applications requiring lower latency on the first request after a period of inactivity, one might consider encrypting the credential and storing it in IndexedDB, but this introduces the significant complexity of key management.
Furthermore, debugging Service Workers can be challenging. The separation from the main thread means developers must be comfortable with the browser’s developer tools for inspecting worker state, network interception, and life cycles. The logic within the worker, especially error handling for network failures or Vault unavailability, must be exceptionally robust to prevent the entire application from becoming non-functional. The additional complexity must be justified by the security requirements of the application.