Constructing a Secure Credential Vending Service with Vault for Direct DynamoDB Access from a Solid.js Frontend


The API gateway was becoming a chokepoint. Every single interaction from our highly reactive Solid.js frontend—fetching user profile details, updating a setting, saving a draft—was being funneled through a central Go service. This service would authenticate the request, perform business logic, and then issue a corresponding query to DynamoDB. For operations requiring complex validation, this pattern is sound. But for simple, high-frequency CRUD on user-isolated data, it introduced unnecessary latency and placed a significant, linear load on our API layer. The challenge was to offload this traffic without compromising our security model. Embedding long-lived AWS credentials in the frontend is a non-starter. The solution required a mechanism to mint temporary, narrowly-scoped credentials on-demand and deliver them to the client.

Our initial concept revolved around a Token Vending Machine (TVM). The frontend would authenticate with our Go backend, which would then assume an IAM role and generate temporary credentials via AWS STS. This is a viable but clunky approach. It would require us to manage the IAM roles and trust policies ourselves, and the logic for generating scoped-down policies on the fly can become complex and error-prone. This is where HashiCorp Vault entered the picture. Vault’s AWS Secrets Engine is purpose-built for this exact scenario: dynamically generating credentials based on pre-defined IAM policies. It abstracts away the direct interaction with AWS IAM for user creation and provides a clean, auditable API for credential generation.

The final architectural decision was a three-way integration:

  1. HashiCorp Vault: Configured with an AWS Secrets Engine role that uses a templated IAM policy. This policy would dynamically inject the authenticated user’s ID into the DynamoDB resource condition, ensuring a client could only ever access its own data partition.
  2. Go Backend Service: A lightweight API that authenticates the user (e.g., via a JWT) and acts as a trusted intermediary. It requests the dynamic AWS credentials from Vault on behalf of the authenticated user and returns them to the frontend.
  3. Solid.js Frontend: Responsible for managing the lifecycle of these short-lived credentials. It would fetch them from the Go service, instantiate the AWS SDK, and handle the logic for proactively refreshing them before they expire.

This design shifts the data plane directly from the client to DynamoDB while keeping the control plane (authentication and credential issuance) secure on the backend.

sequenceDiagram
    participant Client as Solid.js Client
    participant Backend as Go Credential Service
    participant Vault as HashiCorp Vault
    participant AWS_IAM as AWS IAM
    participant AWS_DB as DynamoDB

    Client->>+Backend: POST /api/v1/sts/dynamo-creds (with Auth JWT)
    Backend->>Backend: Validate JWT, extract UserID
    Backend->>+Vault: Request credentials from 'dynamodb-user-role' for UserID
    Vault->>+AWS_IAM: Create temporary IAM user
    AWS_IAM-->>-Vault: Return AccessKey, SecretKey, SessionToken
    Vault-->>-Backend: Return leased credentials with TTL
    Backend-->>-Client: Forward temporary AWS credentials

    Note over Client: Instantiate AWS SDK with temporary creds

    Client->>+AWS_DB: PutItem/GetItem (scoped to UserID partition)
    AWS_DB-->>-Client: Operation successful

Vault Configuration: The Foundation of Security

Everything starts with a correctly configured Vault instance. For development, a simple vault server -dev is sufficient. The critical part is configuring the AWS secrets engine to create users bound to a specific policy template.

First, enable the AWS secrets engine and configure it with root credentials. In a production environment, this would be a dedicated IAM user with permissions limited to creating other users and managing policies.

# Start Vault dev server (for local testing)
# vault server -dev

# Set Vault address and root token
export VAULT_ADDR='http://127.0.0.1:8200'
export VAULT_TOKEN='root'

# Enable AWS secrets engine
vault secrets enable aws

# Configure the secrets engine with root credentials
# In production, use a dedicated IAM user with restricted permissions
vault write aws/config/root \
    access_key="YOUR_AWS_ACCESS_KEY_ID" \
    secret_key="YOUR_AWS_SECRET_ACCESS_KEY" \
    region="us-east-1"

The core of the security model lies in the Vault role and its associated IAM policy. We define a role, let’s call it dynamodb-user-role, that Vault will use to generate credentials. The credential_type is set to iam_user, meaning Vault will create an actual IAM user for each lease, which is cleaned up automatically when the lease expires.
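
As an aside, because every lease maps to a real IAM user, it can be useful during testing to revoke outstanding leases early and confirm the corresponding users disappear from IAM. A quick sketch using the standard Vault CLI:

# Revoke every lease (and therefore every temporary IAM user) issued by this role
vault lease revoke -prefix aws/creds/dynamodb-user-role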

The most important piece is the policy_document, which is provided as a string containing a templated JSON policy. A common mistake here is to create a static policy; the power of Vault is its ability to render these templates. We use Handlebars-style syntax (`{{ }}`) to define placeholders. Here is the policy document, which we'll save as `iam_policy.json.hcl`. It grants access only to specific DynamoDB actions and, crucially, uses the IAM condition key `dynamodb:LeadingKeys`. This ensures the user can only operate on items whose partition key matches their own user ID, which our Go backend will supply during the credential request.

iam_policy.json.hcl

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:PutItem",
        "dynamodb:UpdateItem",
        "dynamodb:DeleteItem",
        "dynamodb:Query"
      ],
      "Resource": "arn:aws:dynamodb:us-east-1:YOUR_ACCOUNT_ID:table/user-data",
      "Condition": {
        "ForAllValues:StringEquals": {
          "dynamodb:LeadingKeys": [
            "{{identity.entity.metadata.user_id}}"
          ]
        }
      }
    }
  ]
}

A pitfall here is understanding how Vault passes metadata. We are using Vault's Identity system. Our Go application will authenticate to Vault and create (or reuse) an entity representing the end-user. We attach metadata (`user_id`) to this entity, and the policy template then accesses it via `{{identity.entity.metadata.user_id}}`. This is a secure way to pass contextual data into the policy.
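
For intuition, the same entity-and-metadata setup the Go service performs later can be sketched manually with the Vault CLI (the user ID and placeholders below are illustrative):

# Create (or update) an entity representing the end-user, carrying user_id metadata
vault write identity/entity name="user-1234" metadata=user_id="user-1234"

# Find the accessor of the auth mount the backend authenticates with (token auth in dev)
vault auth list

# Link the entity to that auth mount via an alias
vault write identity/entity-alias name="user-1234" \
    canonical_id="<entity-id-from-first-command>" \
    mount_accessor="<accessor-from-auth-list>"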

Now, we create the role in Vault, referencing this policy document and setting a short default lease time-to-live (TTL). A 5-minute TTL is a good starting point—long enough to be useful, short enough to limit exposure significantly.

# Create the Vault role with the templated policy
vault write aws/roles/dynamodb-user-role \
    credential_type="iam_user" \
    policy_document=@iam_policy.json.hcl \
    default_sts_ttl="5m" \
    max_sts_ttl="10m"
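
One caveat worth flagging: in my reading of the AWS secrets engine docs, `default_sts_ttl`/`max_sts_ttl` govern the STS-based credential types, while `iam_user` leases follow the engine's lease configuration. Setting that explicitly keeps the 5-minute window intact:

# Lease settings that apply to iam_user credentials
vault write aws/config/lease lease=5m lease_max=10m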

With this configuration, any request to aws/creds/dynamodb-user-role will trigger Vault to generate a new IAM user with a policy dynamically scoped to the user_id provided in the request context.
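
Before moving to the backend, the user-data table referenced in the policy needs to exist, with `userId` as its partition key (the same key the frontend code uses later). A minimal sketch with the AWS CLI, assuming on-demand billing:

aws dynamodb create-table \
    --table-name user-data \
    --attribute-definitions AttributeName=userId,AttributeType=S \
    --key-schema AttributeName=userId,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST \
    --region us-east-1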

The Go Backend: A Secure Intermediary

The Go service is the trusted component that communicates with Vault. It must not be publicly exposed without an authentication layer. We’ll use a simple Gin server for this example, assuming a middleware has already validated a JWT and attached the user’s ID to the request context.

The project structure is minimal:

credential-vendor/
├── go.mod
├── go.sum
└── main.go

The core logic resides in main.go. We’ll use the official HashiCorp Vault client library. The configuration for the Vault address and token should be loaded from environment variables, not hardcoded.
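
Assuming a fresh module, the dependencies can be pulled in with the usual commands (the module name is arbitrary):

go mod init credential-vendor
go get github.com/gin-gonic/gin github.com/gin-contrib/cors
go get github.com/hashicorp/vault/api github.com/hashicorp/vault/api/auth/approle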

main.go

package main

import (
	"context"
	"fmt"
	"log"
	"net/http"
	"os"
	"time"

	"github.com/gin-contrib/cors"
	"github.com/gin-gonic/gin"
	"github.com/hashicorp/vault/api"
	auth "github.com/hashicorp/vault/api/auth/approle"
)

// Config holds application configuration
type Config struct {
	VaultAddr     string
	VaultRoleID   string
	VaultSecretID string
	ListenAddr    string
}

// loadConfig loads configuration from environment variables
func loadConfig() (*Config, error) {
	cfg := &Config{
		VaultAddr:     os.Getenv("VAULT_ADDR"),
		VaultRoleID:   os.Getenv("VAULT_ROLE_ID"),
		VaultSecretID: os.Getenv("VAULT_SECRET_ID"),
		ListenAddr:    ":8080",
	}
	if cfg.VaultAddr == "" {
		return nil, fmt.Errorf("VAULT_ADDR environment variable not set")
	}
	// In a real project, RoleID and SecretID are required for AppRole auth.
	// For this example, we fall back to the root token if they are not set.
	return cfg, nil
}

// newVaultClient creates and authenticates a new Vault client.
// In production, AppRole auth is strongly recommended over tokens.
func newVaultClient(cfg *Config) (*api.Client, error) {
	config := api.DefaultConfig()
	config.Address = cfg.VaultAddr

	client, err := api.NewClient(config)
	if err != nil {
		return nil, fmt.Errorf("failed to create vault client: %w", err)
	}

	// For production, use AppRole instead of a root token.
	// This example shows both for completeness.
	if cfg.VaultRoleID != "" && cfg.VaultSecretID != "" {
		appRoleAuth, err := auth.NewAppRoleAuth(
			cfg.VaultRoleID,
			&auth.SecretID{FromString: cfg.VaultSecretID},
		)
		if err != nil {
			return nil, fmt.Errorf("failed to create approle auth: %w", err)
		}
		authInfo, err := client.Auth().Login(context.Background(), appRoleAuth)
		if err != nil {
			return nil, fmt.Errorf("failed to login with approle: %w", err)
		}
		if authInfo == nil {
			return nil, fmt.Errorf("no auth info was returned after login")
		}
		log.Println("Successfully authenticated to Vault with AppRole.")
	} else {
		// Fallback to token from environment for local dev
		token := os.Getenv("VAULT_TOKEN")
		if token == "" {
			return nil, fmt.Errorf("VAULT_TOKEN must be set if not using AppRole")
		}
		client.SetToken(token)
		log.Println("Using VAULT_TOKEN for authentication.")
	}

	return client, nil
}

// authMiddleware is a placeholder for a real JWT validation middleware.
func authMiddleware() gin.HandlerFunc {
	return func(c *gin.Context) {
		// In a real application, you would validate a JWT here.
		// For this example, we'll extract a user ID from a header.
		userID := c.GetHeader("X-User-ID")
		if userID == "" {
			c.AbortWithStatusJSON(http.StatusUnauthorized, gin.H{"error": "Unauthorized: Missing X-User-ID header"})
			return
		}
		c.Set("userID", userID)
		c.Next()
	}
}

func main() {
	cfg, err := loadConfig()
	if err != nil {
		log.Fatalf("Failed to load configuration: %v", err)
	}

	vaultClient, err := newVaultClient(cfg)
	if err != nil {
		log.Fatalf("Failed to create Vault client: %v", err)
	}

	r := gin.Default()

	// Configure CORS for frontend access
	r.Use(cors.New(cors.Config{
		AllowOrigins:     []string{"http://localhost:3000"}, // Your Solid.js dev server
		AllowMethods:     []string{"GET", "POST"},
		AllowHeaders:     []string{"Origin", "Content-Type", "X-User-ID"},
		ExposeHeaders:    []string{"Content-Length"},
		AllowCredentials: true,
		MaxAge:           12 * time.Hour,
	}))

	apiV1 := r.Group("/api/v1")
	apiV1.Use(authMiddleware())
	{
		apiV1.POST("/sts/dynamo-creds", handleGetDynamoDBCreds(vaultClient))
	}

	log.Printf("Starting server on %s", cfg.ListenAddr)
	if err := r.Run(cfg.ListenAddr); err != nil {
		log.Fatalf("Failed to start server: %v", err)
	}
}

// handleGetDynamoDBCreds is the handler for vending credentials.
func handleGetDynamoDBCreds(client *api.Client) gin.HandlerFunc {
	return func(c *gin.Context) {
		userID, exists := c.Get("userID")
		if !exists {
			c.JSON(http.StatusInternalServerError, gin.H{"error": "User ID not found in context"})
			return
		}

		// This part is crucial for linking the request to the policy template.
		// We create a Vault entity and an alias to represent the end-user.
		// This is idempotent; if they exist, Vault does nothing.
		aliasName, entityID, err := createOrUpdateVaultEntity(client, userID.(string))
		if err != nil {
			log.Printf("Error creating/updating Vault entity for user %s: %v", userID, err)
			c.JSON(http.StatusInternalServerError, gin.H{"error": "Failed to setup identity in Vault"})
			return
		}

		log.Printf("Using entity '%s' for alias '%s'", entityID, entityAlias.Name)
		
		// When requesting credentials, we must provide the entity alias so Vault
		// can link this request to the entity and its metadata.
		secret, err := client.Logical().Write(
			"aws/creds/dynamodb-user-role",
			map[string]interface{}{
				"entity_alias": aliasName, // This is how Vault gets the identity context
			},
		)

		if err != nil {
			log.Printf("Error getting credentials from Vault: %v", err)
			c.JSON(http.StatusInternalServerError, gin.H{"error": "Failed to generate AWS credentials"})
			return
		}

		c.JSON(http.StatusOK, gin.H{
			"access_key":  secret.Data["access_key"],
			"secret_key":  secret.Data["secret_key"],
			"session_token": secret.Data["security_token"], // The key from Vault is 'security_token'
			"lease_duration": secret.LeaseDuration,
		})
	}
}

// createOrUpdateVaultEntity creates (or updates) a Vault Identity Entity and an Alias
// for the given user ID, returning the alias name and the entity ID.
func createOrUpdateVaultEntity(client *api.Client, userID string) (string, string, error) {
	// We need the accessor of the auth method our service uses. Let's assume AppRole.
	// In dev with a root token, we can use the default 'token' accessor.
	authList, err := client.Sys().ListAuth()
	if err != nil {
		return "", "", err
	}

	// Using the token auth accessor for simplicity in dev.
	tokenAuth, ok := authList["token/"]
	if !ok {
		return "", "", fmt.Errorf("token auth mount not found")
	}
	accessor := tokenAuth.Accessor

	// Create or update the entity. Writes to identity/entity/name/<name> are idempotent.
	entityName := fmt.Sprintf("user-%s", userID)
	entityReq := map[string]interface{}{
		"metadata": map[string]string{"user_id": userID},
		"policies": []string{}, // No additional Vault policies needed for the entity itself
	}
	if _, err := client.Logical().Write("identity/entity/name/"+entityName, entityReq); err != nil {
		return "", "", fmt.Errorf("failed to write entity: %w", err)
	}

	// Read the entity back to get its ID. Updating an existing entity can return an
	// empty response body, so relying on the write response alone is not safe.
	entityResp, err := client.Logical().Read("identity/entity/name/" + entityName)
	if err != nil {
		return "", "", fmt.Errorf("failed to read entity: %w", err)
	}
	if entityResp == nil || entityResp.Data["id"] == nil {
		return "", "", fmt.Errorf("entity %s not found after write", entityName)
	}
	entityID := entityResp.Data["id"].(string)

	// Create an alias linking the auth method to this entity. If the alias already
	// exists for this mount accessor, Vault rejects the duplicate; treat that as success.
	aliasReq := map[string]interface{}{
		"name":           userID,
		"canonical_id":   entityID,
		"mount_accessor": accessor,
	}
	if _, err := client.Logical().Write("identity/entity-alias", aliasReq); err != nil {
		log.Printf("entity-alias write for user %s: %v (alias may already exist)", userID, err)
	}

	return userID, entityID, nil
}

This Go service performs the critical role of authenticating the client, securely mapping the client's identity (userID) to a Vault entity, and then using that identity to request scoped AWS credentials. The createOrUpdateVaultEntity function is a crucial piece of the puzzle that only became apparent during implementation; without it, Vault has no context with which to populate the {{identity.entity.metadata.user_id}} template variable. A common early implementation error is to pass parameters directly in the aws/creds request body, which does not work for identity-based templating.
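
A quick way to smoke-test the vending endpoint end to end is a plain curl call; the user ID and response values below are illustrative:

curl -s -X POST http://localhost:8080/api/v1/sts/dynamo-creds \
    -H "X-User-ID: user-1234"

# Expected response shape (values truncated):
# {"access_key":"AKIA...","secret_key":"...","session_token":"...","lease_duration":300}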

Solid.js Frontend: Managing the Credential Lifecycle

The frontend’s responsibility is twofold: fetch the credentials and manage their lifecycle, including proactive renewal. Solid.js’s reactive primitives are exceptionally well-suited for this task. We can create a store or a resource that encapsulates the state of the AWS credentials and the DynamoDB client instance.

We’ll use Vite with the Solid template to set up the project. The key dependencies are @aws-sdk/client-dynamodb and @aws-sdk/lib-dynamodb.
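
Project setup follows the standard Vite flow (the project name here is arbitrary):

npm create vite@latest credential-vending-ui -- --template solid-ts
cd credential-vending-ui
npm install @aws-sdk/client-dynamodb @aws-sdk/lib-dynamodb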

src/services/dynamo.service.ts

import { createStore } from "solid-js/store";
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient } from "@aws-sdk/lib-dynamodb";

interface AwsCredentials {
  access_key: string;
  secret_key: string;
  session_token: string;
  lease_duration: number; // in seconds
}

interface CredentialState {
  credentials?: AwsCredentials;
  client?: DynamoDBDocumentClient;
  isReady: boolean;
  error?: string;
  expiresAt?: Date;
}

// This store will hold our credentials and the initialized client.
const [credentialStore, setCredentialStore] = createStore<CredentialState>({
  isReady: false,
});

let refreshTimeout: number | undefined;

// This is the core function for fetching/refreshing credentials.
const fetchCredentials = async (userId: string): Promise<void> => {
  console.log("Fetching new AWS credentials...");
  try {
    const response = await fetch("http://localhost:8080/api/v1/sts/dynamo-creds", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Pass the user's identity. This would be from a JWT in a real app.
        "X-User-ID": userId, 
      },
    });

    if (!response.ok) {
      const errorData = await response.json();
      throw new Error(errorData.error || "Failed to fetch credentials");
    }

    const creds: AwsCredentials = await response.json();
    
    const dynamoClient = new DynamoDBClient({
      region: "us-east-1", // Should match your DynamoDB region
      credentials: {
        accessKeyId: creds.access_key,
        secretAccessKey: creds.secret_key,
        sessionToken: creds.session_token,
      },
    });

    const docClient = DynamoDBDocumentClient.from(dynamoClient);
    const expiresAt = new Date(Date.now() + creds.lease_duration * 1000);

    setCredentialStore({
      credentials: creds,
      client: docClient,
      isReady: true,
      error: undefined,
      expiresAt: expiresAt,
    });
    
    console.log(`Credentials successfully fetched. Valid until: ${expiresAt.toLocaleTimeString()}`);
    
    // Proactively refresh credentials before they expire.
    scheduleRefresh(creds.lease_duration, userId);

  } catch (err: any) {
    console.error("Credential fetch error:", err);
    setCredentialStore({
      error: err.message,
      isReady: false,
    });
  }
};

const scheduleRefresh = (leaseSeconds: number, userId: string) => {
  if (refreshTimeout) {
    clearTimeout(refreshTimeout);
  }
  // Schedule refresh for 80% of the lease duration to avoid race conditions.
  const refreshDelay = leaseSeconds * 0.8 * 1000;
  console.log(`Scheduling next credential refresh in ${refreshDelay / 1000} seconds.`);
  
  refreshTimeout = setTimeout(() => {
    fetchCredentials(userId);
  }, refreshDelay);
};

// Expose the store and an initializer function.
export { credentialStore, fetchCredentials };

One of the first problems encountered was handling credential expiry. A naive implementation would fetch credentials once and then fail when they expired. The scheduleRefresh function solves this. It uses setTimeout to trigger fetchCredentials again before the current lease expires, creating a seamless rolling credential window for the user. Using 80% of the lease duration provides a safe buffer against network latency.

Now, a component can use this service.

src/components/UserProfile.tsx

import { createEffect, createSignal, onMount, Show } from "solid-js";
import { GetCommand, PutCommand } from "@aws-sdk/lib-dynamodb";
import { credentialStore, fetchCredentials } from "../services/dynamo.service";

// In a real app, this would come from an auth context.
const FAKE_USER_ID = "user-1234"; 

interface UserData {
  userId: string;
  username: string;
  settings: {
    theme: string;
  };
}

const UserProfile = () => {
  const [userData, setUserData] = createSignal<UserData | null>(null);
  const [isLoading, setIsLoading] = createSignal<boolean>(true);
  const [newUsername, setNewUsername] = createSignal("");

  onMount(() => {
    // Initial credential fetch
    fetchCredentials(FAKE_USER_ID);
  });

  // Reactive effect to fetch user data once the DynamoDB client is ready.
  createEffect(async () => {
    if (credentialStore.isReady && credentialStore.client) {
      setIsLoading(true);
      try {
        const command = new GetCommand({
          TableName: "user-data",
          Key: { userId: FAKE_USER_ID },
        });
        const { Item } = await credentialStore.client.send(command);
        if (Item) {
          setUserData(Item as UserData);
          setNewUsername(Item.username);
        } else {
          // Handle the case for a new user
          const newUser: UserData = { userId: FAKE_USER_ID, username: "New User", settings: { theme: "dark" } };
          setUserData(newUser);
          setNewUsername(newUser.username);
        }
      } catch (e) {
        console.error("Failed to fetch user data from DynamoDB:", e);
      } finally {
        setIsLoading(false);
      }
    }
  });

  const handleSave = async () => {
    const current = userData();
    if (!credentialStore.client || !current) return;
    setIsLoading(true);

    const updatedData: UserData = { ...current, username: newUsername() };

    try {
      const command = new PutCommand({
        TableName: "user-data",
        Item: updatedData,
      });
      await credentialStore.client.send(command);
      setUserData(updatedData);
      alert("Profile saved!");
    } catch (e) {
      console.error("Failed to save user data:", e);
      alert("Failed to save profile. Check console.");
    } finally {
      setIsLoading(false);
    }
  };

  return (
    <div>
      <h2>User Profile</h2>
      <Show when={credentialStore.error}>
        <p style={{ color: "red" }}>Error: {credentialStore.error}</p>
      </Show>
      
      <Show when={!credentialStore.isReady && !credentialStore.error}>
        <p>Initializing secure connection...</p>
      </Show>

      <Show when={credentialStore.isReady && !isLoading() && userData()} keyed>
        {(data) => (
          <div>
            <p>User ID: {data.userId}</p>
            <label>
              Username:
              <input
                type="text"
                value={newUsername()}
                onInput={(e) => setNewUsername(e.currentTarget.value)}
              />
            </label>
            <button onClick={handleSave} disabled={isLoading()}>
              Save
            </button>
          </div>
        )}
      </Show>

      <Show when={isLoading() && credentialStore.isReady}>
        <p>Loading data...</p>
      </Show>
    </div>
  );
};

export default UserProfile;

This architecture is not without its limitations. It introduces client-side complexity for managing credentials and their lifecycle. It also tightly couples the frontend to the AWS SDK, making a future migration to a different cloud provider or database more challenging. Furthermore, this pattern is only suitable for data operations that don’t require complex, transactional business logic that must be enforced server-side. It excels at offloading simple, user-isolated state management from the API server.

A potential future iteration could involve Vault’s JWT/OIDC Auth Method. This would allow the Go backend to authenticate to Vault using the same JWT provided by the client, creating a more direct and elegant trust chain without the need for AppRole credentials. Another consideration is the performance and cost implications of frequently creating IAM users in AWS. While Vault and AWS are highly optimized, at extreme scale, the latency of the initial user creation might become a factor, potentially warranting an investigation into Vault’s ability to lease and renew from a pool of pre-warmed roles rather than creating a user from scratch for every new client session.

