Implementing a Federated Micro-Frontend Architecture with a SAML-to-gRPC Authentication Bridge Using esbuild


The project started with a familiar, painful directive: modernize a sprawling, monolithic enterprise portal. Its frontend was a combination of jQuery and an early version of Angular, with a build process that took a CI runner nearly twenty minutes to complete. Any change, no matter how small, required a full redeployment. The most significant constraint, however, was the authentication system. It was deeply entangled with a corporate SAML 2.0 Identity Provider (IdP), a black box managed by a separate IT department that was resistant to change. The entire security model was predicated on server-side sessions initiated by a SAML POST binding flow, a pattern fundamentally hostile to modern Single Page Applications.

Our initial proposal was a micro-frontend architecture. The idea was to create a lightweight “shell” application responsible only for core services like routing and authentication, which would then load independent applications—the micro-frontends (MFEs)—developed and deployed by separate teams. This would solve the deployment bottleneck and allow teams to innovate on their own stacks. But two immediate, show-stopping problems emerged: how to reconcile the SAML flow with an SPA, and how to prevent the combined build time of dozens of MFEs from becoming even worse than the monolith’s.

This led to a technology selection process driven by these harsh constraints. For the build system, Webpack’s Module Federation was the obvious choice, but its configuration complexity and slower performance were concerns. We benchmarked esbuild and found its speed transformative. A cold build that took Webpack minutes finished in under five seconds. The trade-off was that esbuild lacked a native module federation solution. We decided the performance gain was worth the cost of building a simplified, custom federation mechanism.

For backend communication, the organization was already migrating to gRPC for internal services. Exposing these as REST endpoints for the frontend felt like a step backward, introducing another layer of translation and losing end-to-end type safety. gRPC-Web was the logical choice, allowing our React frontends to communicate directly with backend services using Protobuf-generated clients.

For styling, with multiple teams contributing components, style collisions were a certainty. Styled-components offered true CSS encapsulation at the component level, a non-negotiable requirement for MFE stability.

Finally, the SAML problem. No browser-side library could reliably handle the full SAML dance involving XML parsing, signature validation, and encrypted assertions. The only viable path was to introduce a new component: a lightweight Backend-for-Frontend (BFF) service. This Node.js server would act as the sole SAML Service Provider (SP). It would manage the entire interaction with the IdP, consume the SAML assertion, and in return, issue a standard JWT to the frontend shell. The shell and all subsequent MFEs would then operate in a standard stateless JWT world, blissfully unaware of the SAML complexity happening behind the curtain. This architecture forms the core of our solution.

The SAML Authentication Bridge (BFF)

The BFF’s only job is to be a translator. It speaks SAML to the IdP and JWT to the client application. In a real-world project, this service must be highly available, but for implementation purposes, a simple Express server is sufficient to prove the pattern. We used the passport-saml library, which handles most of the XML boilerplate.

The core of the setup is the passport-saml strategy configuration. The pitfall here is managing the SAML certificates. These should never be checked into source control; in production, they would be injected via a secret management system.

// bff/src/auth/saml.ts
import { Strategy as SamlStrategy, Profile } from 'passport-saml';
import passport from 'passport';
import fs from 'fs';
import path from 'path';

// WARNING: In production, load these from a secure vault, not the filesystem.
const idpCert = fs.readFileSync(path.join(__dirname, '../../certs/idp.crt'), 'utf-8');
const spPKey = fs.readFileSync(path.join(__dirname, '../../certs/sp.key'), 'utf-8');
const spCert = fs.readFileSync(path.join(__dirname, '../../certs/sp.crt'), 'utf-8');

const SAML_ENTRY_POINT = process.env.SAML_ENTRY_POINT || 'https://idp.example.com/sso';
const SAML_ISSUER = process.env.SAML_ISSUER || 'urn:example:sp';
const APP_BASE_URL = process.env.APP_BASE_URL || 'http://localhost:8080';

export const configureSaml = () => {
  const samlStrategy = new SamlStrategy(
    {
      callbackUrl: `${APP_BASE_URL}/api/auth/callback`,
      entryPoint: SAML_ENTRY_POINT,
      issuer: SAML_ISSUER,
      cert: idpCert, // IdP's public certificate to verify their signature
      privateKey: spPKey, // Our private key to sign our requests (if required)
      decryptionPvk: spPKey, // Our private key to decrypt assertions
      // In a real project, you would want to enable signature validation.
      // wantAssertionsSigned: true,
      // validateInResponseTo: true,
      disableRequestedAuthnContext: true,
    },
    (profile: Profile | null | undefined, done: (err: Error | null, user?: object) => void) => {
      // The profile object contains the user attributes from the SAML assertion.
      // A common mistake is to trust all attributes. Sanitize and map them
      // carefully, and avoid logging the raw profile: it contains PII.
      if (!profile) {
        return done(new Error('SAML assertion contained no profile'));
      }
      const user = {
        id: profile.nameID,
        email: profile['urn:oid:0.9.2342.19200300.100.1.3'], // Standard OID for mail
        firstName: profile['urn:oid:2.5.4.42'], // givenName
        lastName: profile['urn:oid:2.5.4.4'], // surname (sn)
      };
      return done(null, user);
    }
  );

  passport.use(samlStrategy);

  // serializeUser/deserializeUser only run when session support is enabled;
  // they are kept as pass-throughs because our routes use session: false.
  passport.serializeUser((user, done) => {
    done(null, user);
  });

  passport.deserializeUser((user, done) => {
    done(null, user as object);
  });

  return samlStrategy;
};

With the strategy configured, the Express routes are straightforward. One route initiates the login by redirecting the user to the IdP, and the other serves as the Assertion Consumer Service (ACS) callback URL where the IdP will POST the SAML assertion.

// bff/src/server.ts
import express from 'express';
import passport from 'passport';
import jwt from 'jsonwebtoken';
import cookieParser from 'cookie-parser';
import { configureSaml } from './auth/saml';

const app = express();
const PORT = process.env.PORT || 8080;
const JWT_SECRET = process.env.JWT_SECRET || 'a-very-secret-key-that-should-be-in-a-vault';
const UI_BASE_URL = process.env.UI_BASE_URL || 'http://localhost:3000';

app.use(cookieParser());
app.use(express.urlencoded({ extended: false }));
app.use(passport.initialize());

configureSaml();

// 1. Login Initiation: Redirects user to the IdP
app.get('/api/auth/login', passport.authenticate('saml', {
  session: false, // no server-side session; the JWT cookie will carry identity
  failureRedirect: '/login-error',
}));

// 2. ACS Callback: IdP posts SAML assertion here
app.post(
  '/api/auth/callback',
  passport.authenticate('saml', {
    failureRedirect: '/login-error',
    session: false, // We are not using server-side sessions; JWT is the goal.
  }),
  (req, res) => {
    // 3. Assertion is valid, user object is on req.user. Now, create a JWT.
    const user = req.user as { id: string; email: string };
    const token = jwt.sign(
      { sub: user.id, email: user.email },
      JWT_SECRET,
      { expiresIn: '1h' } // Keep token lifetime short
    );

    // 4. Set the JWT in a secure, HttpOnly cookie. The frontend cannot access this via JS.
    res.cookie('auth_token', token, {
      httpOnly: true,
      secure: process.env.NODE_ENV === 'production',
      sameSite: 'strict',
      maxAge: 3600 * 1000, // 1 hour
    });
    
    // 5. Redirect the user back to the main UI application.
    res.redirect(UI_BASE_URL);
  }
);

// A simple endpoint for the UI to verify if the user is logged in
app.get('/api/auth/me', (req, res) => {
  try {
    const token = req.cookies.auth_token;
    if (!token) {
      return res.status(401).json({ message: 'Unauthorized' });
    }
    const decoded = jwt.verify(token, JWT_SECRET);
    res.json({ user: decoded });
  } catch (error) {
    res.status(401).json({ message: 'Invalid token' });
  }
});


app.listen(PORT, () => console.log(`BFF listening on port ${PORT}`));

This BFF now acts as a perfect isolation layer. The entire complexity of SAML is contained within it. The frontend shell only needs to know how to redirect to /api/auth/login and that a valid session is indicated by the presence of an auth_token cookie.
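
Logout is the same translation in reverse. A minimal sketch (a production deployment would also initiate SAML Single Logout against the IdP, which passport-saml supports):

// bff/src/server.ts (additional route)
// Clearing the cookie ends the frontend session; note that the IdP session
// may outlive it unless Single Logout is performed.
app.get('/api/auth/logout', (req, res) => {
  res.clearCookie('auth_token', { httpOnly: true, sameSite: 'strict' });
  res.redirect(UI_BASE_URL);
});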

esbuild for Micro-Frontend Federation

This was the most experimental part of the architecture. Instead of a full-fledged module federation implementation, we opted for a simpler convention-based approach powered by an esbuild plugin. The strategy is as follows:

  1. Each MFE exposes an index.ts file that exports its mountable components.
  2. The MFE build process generates a standard JS bundle and a manifest.json file detailing the output path and exported components (an example manifest follows this list).
  3. The shell application’s build uses a custom esbuild plugin that intercepts imports from MFEs (e.g., import { MyWidget } from '@mfe/dashboard').
  4. The plugin consults the MFE’s manifest to rewrite the import path to the correct, fully-qualified URL where the MFE bundle is hosted.
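
The manifest convention is deliberately tiny. Based on the generator in the MFE build script shown later, dashboard-mfe/dist/manifest.json looks like this:

{
  "entry": "main.js",
  "exports": ["./DashboardWidget"]
}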

Here is a simplified version of the esbuild plugin. A production version would need more robust error handling and manifest validation.

// build/mfe-plugin.js

const mfePlugin = (options) => ({
  name: 'mfe-federation',
  setup(build) {
    const { remotes } = options; // e.g., { "@mfe/dashboard": "http://localhost:3001" }

    build.onResolve({ filter: /^@mfe\// }, async (args) => {
      // Guard against a non-matching path instead of destructuring null.
      const match = args.path.match(/^@[^/]+\/[^/]+/);
      const remoteName = match ? match[0] : null;
      if (!remoteName || !remotes[remoteName]) {
        return { errors: [{ text: `Remote MFE "${args.path}" not defined in config.` }] };
      }
      return {
        path: args.path,
        namespace: 'mfe-remotes',
        pluginData: {
          remoteBaseUrl: remotes[remoteName],
          importPath: args.path,
        },
      };
    });

    // Remote bundle URLs must survive bundling verbatim so the browser can
    // fetch them at runtime. Marking http(s) paths external stops esbuild
    // from trying (and failing) to resolve the re-export emitted below.
    build.onResolve({ filter: /^https?:\/\// }, (args) => ({
      path: args.path,
      external: true,
    }));

    // This loader materializes the virtual module: it fetches the MFE's
    // manifest at build time and emits a re-export of the remote bundle URL.
    build.onLoad({ filter: /.*/, namespace: 'mfe-remotes' }, async (args) => {
      const { remoteBaseUrl, importPath } = args.pluginData;
      const manifestUrl = `${remoteBaseUrl}/manifest.json`;

      // Requires Node 18+ for the global fetch. In a real-world scenario,
      // you'd cache this fetch across rebuilds.
      console.log(`Fetching manifest for ${importPath} from ${manifestUrl}`);
      const response = await fetch(manifestUrl);
      if (!response.ok) {
        return { errors: [{ text: `Failed to load manifest from ${manifestUrl}` }] };
      }
      const manifest = await response.json();
      const remoteUrl = `${remoteBaseUrl}/${manifest.entry}`;

      // The generated module re-exports from the remote URL. Because that URL
      // is marked external above, it survives bundling untouched and the
      // browser fetches the MFE bundle at runtime via native ESM. This is the
      // core of the federation magic.
      return {
        contents: `export * from '${remoteUrl}';`,
        loader: 'js',
      };
    });
  },
});

module.exports = { mfePlugin };

The build script for the shell application would then use this plugin:

// shell-app/esbuild.config.js
const esbuild = require('esbuild');
const { mfePlugin } = require('../build/mfe-plugin');

esbuild.build({
  entryPoints: ['src/index.tsx'],
  bundle: true,
  outfile: 'public/bundle.js',
  sourcemap: true,
  // The plugin emits re-exports of remote URLs, which only work as native ES
  // modules, so the shell itself must be built as ESM and loaded with
  // <script type="module">.
  format: 'esm',
  define: { 'process.env.NODE_ENV': '"development"' },
  plugins: [
    mfePlugin({
      remotes: {
        // This tells our plugin where to find the running MFEs.
        // In production, this would point to a CDN or static asset server.
        '@mfe/dashboard': 'http://localhost:3001',
        '@mfe/profile': 'http://localhost:3002',
      },
    }),
  ],
  loader: { '.tsx': 'tsx', '.ts': 'ts' },
}).catch(() => process.exit(1));

And an MFE’s build script would be responsible for generating that manifest.json:

// dashboard-mfe/esbuild.config.js
const esbuild = require('esbuild');
const fs = require('fs/promises');
const path = require('path');

const entryPoint = 'src/index.tsx';
const outFile = 'dist/main.js';

esbuild.build({
  entryPoints: [entryPoint],
  bundle: true,
  outfile: outFile,
  format: 'esm', // Must be ESM so the browser can load the bundle as a native module at runtime
  // ... other configs
}).then(async () => {
  // Generate the manifest after a successful build
  const manifest = {
    entry: path.basename(outFile),
    exports: ['./DashboardWidget'], // This could be auto-generated for more complex apps
  };
  await fs.writeFile(path.join(__dirname, 'dist', 'manifest.json'), JSON.stringify(manifest));
  console.log('MFE build complete and manifest generated.');
}).catch(() => process.exit(1));
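
The MFE's entry file is just a barrel that re-exports its mountable components; the manifest's exports list mirrors it:

// dashboard-mfe/src/index.tsx
// The MFE's public surface: anything the shell may import must be re-exported here.
export { DashboardWidget } from './DashboardWidget';

One operational detail the configs gloss over: the shell imports MFE bundles cross-origin, so whatever serves an MFE's dist/ directory must send CORS headers (the plugin's build-time manifest fetch is server-to-server, but the browser's runtime import is not). A minimal sketch, assuming Express serves the static assets in development:

// dashboard-mfe/serve.js
// Dev-only static server; the Access-Control-Allow-Origin header lets the
// shell (on another origin) import the ESM bundle at runtime.
const express = require('express');

const app = express();
app.use((_req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', 'http://localhost:3000');
  next();
});
app.use(express.static('dist'));
app.listen(3001, () => console.log('Dashboard MFE assets on :3001'));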

The Shell Application and Authentication Flow

The shell is a standard React application. Its primary responsibility is to manage the authentication state and dynamically render the MFEs. We use a simple AuthContext to abstract the logic.

// shell-app/src/AuthContext.tsx
import React, { createContext, useState, useEffect, useContext, FC } from 'react';

interface AuthState {
  isAuthenticated: boolean;
  user: { sub: string; email: string } | null;
  isLoading: boolean;
}

const AuthContext = createContext<AuthState>({
  isAuthenticated: false,
  user: null,
  isLoading: true,
});

export const AuthProvider: FC<{ children: React.ReactNode }> = ({ children }) => {
  const [authState, setAuthState] = useState<AuthState>({
    isAuthenticated: false,
    user: null,
    isLoading: true,
  });

  useEffect(() => {
    const verifyUser = async () => {
      try {
        // This call hits our BFF; the browser attaches the HttpOnly cookie.
        // Assumes /api is same-origin (the BFF serves the shell, or a dev proxy).
        const response = await fetch('/api/auth/me');
        if (response.ok) {
          const { user } = await response.json();
          setAuthState({ user, isAuthenticated: true, isLoading: false });
        } else {
          throw new Error('Not authenticated');
        }
      } catch (error) {
        setAuthState({ user: null, isAuthenticated: false, isLoading: false });
      }
    };
    verifyUser();
  }, []);

  return <AuthContext.Provider value={authState}>{children}</AuthContext.Provider>;
};

export const useAuth = () => useContext(AuthContext);

The main application component conditionally renders content or triggers the login flow.

// shell-app/src/App.tsx
import React, { Suspense } from 'react';
import { useAuth } from './AuthContext';

// Dynamic import powered by our esbuild plugin
const DashboardWidget = React.lazy(() => import('@mfe/dashboard').then(module => ({ default: module.DashboardWidget })));

const App = () => {
  const { isAuthenticated, isLoading } = useAuth();

  if (isLoading) {
    return <div>Loading session...</div>;
  }

  if (!isAuthenticated) {
    // The user is not logged in. Redirect to the BFF's login endpoint, which
    // kicks off the SAML flow. (Navigating during render is a shortcut for
    // brevity; a production app would do this in a useEffect.)
    window.location.href = '/api/auth/login';
    return <div>Redirecting to login...</div>;
  }

  return (
    <div>
      <h1>Enterprise Portal Shell</h1>
      <p>Welcome! You are authenticated.</p>
      <hr />
      <Suspense fallback={<div>Loading Dashboard MFE...</div>}>
        <DashboardWidget />
      </Suspense>
    </div>
  );
};

export default App;
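
For completeness, the shell's entry point referenced in the esbuild config is the usual React 18 bootstrap wrapped in the provider (a sketch; this file wasn't shown above):

// shell-app/src/index.tsx
import React from 'react';
import { createRoot } from 'react-dom/client';
import { AuthProvider } from './AuthContext';
import App from './App';

// Mount the shell; any MFE rendered beneath it can read auth state via useAuth().
createRoot(document.getElementById('root')!).render(
  <AuthProvider>
    <App />
  </AuthProvider>
);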

gRPC-Web Integration with JWT Interceptors

With authentication handled, the next piece is secure communication with the backend gRPC services. The frontend never sees the JWT (it lives in an HttpOnly cookie), so it cannot attach an Authorization header itself; this is actually a security benefit. Because the browser attaches the cookie automatically, our gRPC-Web requests carry it, provided the gRPC-Web proxy is reachable under the same site as the shell (fronted by the BFF, for instance) and the client opts into sending credentials.
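
The missing piece of plumbing is that cookie-to-metadata translation. A minimal sketch of how the BFF could do it, assuming the http-proxy-middleware package (v2 API) in front of an Envoy gRPC-Web proxy on port 9090 (both are assumptions, not part of the code above):

// bff/src/grpc-proxy.ts
import { createProxyMiddleware } from 'http-proxy-middleware';

// Forwards gRPC-Web traffic to the Envoy proxy, promoting the HttpOnly
// auth_token cookie into the authorization metadata the backend expects.
export const grpcWebProxy = createProxyMiddleware({
  target: 'http://localhost:9090',
  changeOrigin: true,
  onProxyReq: (proxyReq, req) => {
    const token = (req as any).cookies?.auth_token; // cookie-parser runs upstream
    if (token) {
      proxyReq.setHeader('authorization', `Bearer ${token}`);
    }
  },
});

// Mounted in bff/src/server.ts, e.g.: app.use('/user.UserService', grpcWebProxy);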

On the backend, however, the gRPC services need to validate this token. This is a perfect use case for a gRPC interceptor.

First, the .proto definition for a sample service:

// protos/user.proto
syntax = "proto3";

package user;

service UserService {
  rpc GetUserProfile(GetUserProfileRequest) returns (GetUserProfileResponse);
}

message GetUserProfileRequest {}

message GetUserProfileResponse {
  string user_id = 1;
  string email = 2;
  string display_name = 3;
}

Next, a gRPC service implementation in Node.js with a JWT-validating interceptor.

// user-service/src/server.ts
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import jwt from 'jsonwebtoken';

const JWT_SECRET = process.env.JWT_SECRET || 'a-very-secret-key-that-should-be-in-a-vault';

// JWT-validating server interceptor (requires @grpc/grpc-js >= 1.10, which
// introduced server interceptors). In a real gRPC-Web setup, the JWT is
// extracted from the cookie by an upstream proxy (like our BFF) and injected
// into the 'authorization' metadata of the gRPC request.
const jwtAuthInterceptor: grpc.ServerInterceptor = (_methodDescriptor, call) => {
  return new grpc.ServerInterceptingCall(call, {
    start: (next) => {
      next({
        onReceiveMetadata: (metadata, mdNext) => {
          const authHeader = metadata.get('authorization')[0];
          if (!authHeader) {
            call.sendStatus({
              code: grpc.status.UNAUTHENTICATED,
              details: 'Missing authentication token',
            });
            return;
          }
          try {
            const token = String(authHeader).split(' ')[1];
            const decoded = jwt.verify(token, JWT_SECRET) as jwt.JwtPayload;
            // Interceptors have no request-scoped context object, so we hand
            // the verified claims to the handler through metadata.
            metadata.set('x-verified-user', JSON.stringify(decoded));
            mdNext(metadata);
          } catch (err) {
            call.sendStatus({
              code: grpc.status.UNAUTHENTICATED,
              details: 'Invalid token',
            });
          }
        },
      });
    },
  });
};

const packageDefinition = protoLoader.loadSync('./protos/user.proto');
const userProto = grpc.loadPackageDefinition(packageDefinition).user as any;

const server = new grpc.Server({ interceptors: [jwtAuthInterceptor] });

server.addService(userProto.UserService.service, {
  GetUserProfile: (
    call: grpc.ServerUnaryCall<unknown, unknown>,
    callback: grpc.sendUnaryData<object>
  ) => {
    // The interceptor rejects unauthenticated calls, so this entry exists.
    const user = JSON.parse(String(call.metadata.get('x-verified-user')[0]));
    callback(null, {
      userId: user.sub,
      email: user.email,
      displayName: 'Test User',
    });
  },
});

server.bindAsync('0.0.0.0:50051', grpc.ServerCredentials.createInsecure(), () => {
  // bindAsync starts the server in current @grpc/grpc-js; server.start() is deprecated.
  console.log('gRPC user service running on port 50051');
});

The MFE, using Styled-components, can now call this service.

// dashboard-mfe/src/DashboardWidget.tsx
import React, { useEffect, useState } from 'react';
import styled from 'styled-components';
// Assume generated gRPC-Web client stubs are available
import { UserServiceClient } from './generated/user_grpc_web_pb';
import { GetUserProfileRequest } from './generated/user_pb';

// URL of the gRPC-Web proxy. withCredentials makes the browser attach the
// HttpOnly auth_token cookie on cross-origin calls.
const client = new UserServiceClient('http://localhost:8080', null, { withCredentials: true });

const WidgetContainer = styled.div`
  border: 2px solid #007bff;
  padding: 16px;
  border-radius: 8px;
  background-color: #f0f8ff;
  box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);
`;

const Title = styled.h2`
  color: #0056b3;
  margin-top: 0;
`;

const UserInfo = styled.p`
  font-family: monospace;
  background-color: #e9ecef;
  padding: 8px;
  border-radius: 4px;
`;

export const DashboardWidget = () => {
  const [userData, setUserData] = useState<any>(null);
  const [error, setError] = useState('');

  useEffect(() => {
    const request = new GetUserProfileRequest();
    // The browser attaches the 'auth_token' cookie (withCredentials above).
    // The proxy in front of the gRPC service reads the cookie and forwards
    // the JWT as metadata for the backend interceptor to validate.
    client.getUserProfile(request, {}, (err, response) => {
      if (err) {
        setError(`gRPC Error: ${err.message}`);
        return;
      }
      setUserData(response.toObject());
    });
  }, []);

  return (
    <WidgetContainer>
      <Title>Dashboard MFE</Title>
      {error && <p style={{ color: 'red' }}>{error}</p>}
      {userData ? (
        <UserInfo>
          User ID: {userData.userId}<br />
          Email: {userData.email}
        </UserInfo>
      ) : (
        <p>Loading user data via gRPC-Web...</p>
      )}
    </WidgetContainer>
  );
};

This diagram illustrates the full authentication and data-fetching flow:

sequenceDiagram
    participant User
    participant Browser
    participant ShellApp
    participant BFF as BFF (SAML SP)
    participant SAML_IdP as SAML IdP
    participant gRPC_Service

    User->>Browser: Accesses portal
    Browser->>ShellApp: Loads shell
    ShellApp->>BFF: GET /api/auth/me (no auth cookie yet)
    BFF-->>ShellApp: 401 Unauthorized
    ShellApp->>Browser: Redirect to /api/auth/login
    Browser->>BFF: GET /api/auth/login
    BFF->>SAML_IdP: Redirect with SAMLRequest
    User->>SAML_IdP: Enters credentials
    SAML_IdP->>Browser: POST SAMLResponse to BFF's ACS URL
    Browser->>BFF: POST /api/auth/callback
    BFF->>BFF: Validates SAML assertion
    BFF->>BFF: Creates JWT
    BFF->>Browser: Sets HttpOnly cookie & redirects to Shell
    Browser->>ShellApp: Loads shell again (with cookie)
    ShellApp->>BFF: GET /api/auth/me
    BFF-->>ShellApp: 200 OK (user data from JWT)
    ShellApp->>ShellApp: Renders authenticated UI
    ShellApp->>Browser: Dynamically loads Dashboard MFE
    Browser->>gRPC_Service: gRPC-Web call (proxy maps cookie to metadata)
    gRPC_Service->>gRPC_Service: Interceptor validates JWT from metadata
    gRPC_Service-->>Browser: gRPC-Web response

The resulting architecture, while complex, successfully decouples the legacy authentication system from our modern frontend stack. The esbuild-powered builds are consistently fast, gRPC-Web provides type-safe and efficient data transfer, Styled-components ensures style isolation, and the entire system can be developed and deployed in a decentralized manner.

The custom esbuild federation plugin, however, remains a point of significant technical debt. It lacks the sophisticated shared-dependency management and versioning control of Webpack's Module Federation, creating a risk of runtime errors if teams are not disciplined about their dependency versions. Furthermore, the BFF, while solving the SAML problem, introduces a new critical component: it is the sole holder of the JWT signing key and sits on every login path, making it a single point of failure that must be engineered for high availability and low latency. Future iterations would involve either replacing the custom plugin with an emerging native federation solution for esbuild or investing heavily in its feature set to make it production-hardened. The security of the BFF itself also becomes paramount; it is the gatekeeper to the entire system and must be subject to rigorous security audits.
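
One mitigation we prototyped for the shared-dependency problem: mark singletons such as React as external in every build, and let an import map in the shell's HTML resolve the bare specifiers to a single hosted copy. A sketch, not production-hardened:

// shared-deps sketch: applied to both the shell and every MFE build config
const esbuild = require('esbuild');

esbuild.build({
  entryPoints: ['src/index.tsx'],
  bundle: true,
  format: 'esm',
  outfile: 'dist/main.js',
  // Emit bare `import ... from 'react'` statements instead of inlining a copy.
  external: ['react', 'react-dom'],
}).catch(() => process.exit(1));

// A <script type="importmap"> in the shell's index.html then maps each bare
// specifier (react, react-dom, react-dom/client, ...) to one hosted copy, so
// every MFE shares the same React instance and hooks work across boundaries.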

