Maestro E2E Pipeline

Full Android E2E pipeline: GitHub Actions → Maestro → Dashboard

Trigger, run, and analyze Android end-to-end flows from a single dashboard. This page documents the project structure, scripts, and CI wiring.

Quick map

  • Maestro flows live in maestro/flows/
  • Dashboard API + SQLite in dashboard/server.js
  • CI orchestration in .github/workflows/e2e.yml

Project Structure

e2e-pipeline/
├── .github/workflows/
│   └── e2e.yml              # CI: triggers on push to develop
├── maestro/
│   ├── flows/               # Maestro YAML test flows
│   │   └── 01_login.yaml
│   └── config/
│       └── config.yaml      # Default env vars
├── dashboard/
│   ├── server.js            # Express API + SQLite
│   ├── public/index.html    # Analytics dashboard SPA
│   ├── Dockerfile
│   └── package.json
├── docker/
│   └── nginx.conf           # Reverse proxy config
└── docker-compose.yml

Phase 1 — Local Setup

1. Install Maestro

curl -Ls "https://get.maestro.mobile.dev" | bash

2. Run the dashboard locally

cd dashboard
npm install
DASHBOARD_API_KEY=dev-key node server.js
# Open http://localhost:3000

3. Run Maestro flows locally

# Start your AVD first in Android Studio, then:
maestro test \
  --env TEST_ACCOUNT=test@yourapp.com \
  --env APP_PACKAGE=com.yourapp.debug \
  maestro/flows/
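The tree above lists 01_login.yaml but never shows a flow. As a sketch only (the selectors, button labels, and screens are assumptions about your app, not part of this project), a login flow might look like:

```yaml
# maestro/flows/01_login.yaml (illustrative; selectors depend on your app's UI)
appId: ${APP_PACKAGE}
tags:
  - smoke
---
- launchApp:
    clearState: true
- tapOn: "Email"
- inputText: ${TEST_ACCOUNT}
- tapOn: "Log in"
- assertVisible: "Home"
```

The ${APP_PACKAGE} and ${TEST_ACCOUNT} placeholders are filled by the --env flags shown in the command above.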

Phase 2 — Docker (Production)

1. Create .env file

cp .env.example .env
# Edit DASHBOARD_API_KEY
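If you would rather generate a strong key than invent one, openssl (assumed to be installed) works well:

```shell
# Print a random 64-character hex string to use as DASHBOARD_API_KEY
openssl rand -hex 32
```

Paste the output into .env as the DASHBOARD_API_KEY value; the same string goes into the DASHBOARD_API_KEY repo secret later.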

2. Build and start

docker compose up -d --build

3. Persist data

The SQLite database lives in the dashboard-data Docker volume, so it survives container restarts and rebuilds.

CI Setup (GitHub Actions)

Secrets to add in your repo Settings → Secrets:

TEST_ACCOUNT_1      First test account email
TEST_ACCOUNT_2      Second test account email
TEST_PASSWORD       Test account password
DASHBOARD_URL       e.g. http://your-server-ip
DASHBOARD_API_KEY   Same key as in your .env

Parallelism

Each avd_index entry in the matrix runs as a separate job with its own emulator and test account. Add more entries to matrix.avd_index (plus a matching include block and secret) to scale up parallel runs.
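For example, scaling from two to three parallel AVDs (assuming you also add a TEST_ACCOUNT_3 secret) would look like:

```yaml
strategy:
  fail-fast: false
  matrix:
    avd_index: [1, 2, 3]
    include:
      - avd_index: 1
        test_account: ${{ secrets.TEST_ACCOUNT_1 }}
      - avd_index: 2
        test_account: ${{ secrets.TEST_ACCOUNT_2 }}
      - avd_index: 3
        test_account: ${{ secrets.TEST_ACCOUNT_3 }}
```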

Results

Live results are served from the dashboard API. Use these endpoints to inspect recent runs.

GET /api/runs
GET /api/runs?limit=50
GET /api/runs?status=failed
GET /api/stats

Open them directly in the browser to see JSON, or connect a frontend to visualize.
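For reference, a row returned by GET /api/runs follows the runs table schema; all values below are illustrative:

```json
[
  {
    "id": 42,
    "avd_index": 1,
    "status": "passed",
    "branch": "develop",
    "commit": "9fceb02d",
    "run_id": "9876543210",
    "created_at": "2024-05-01 12:34:56"
  }
]
```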

Adding New Flows

1. Create maestro/flows/03_my_feature.yaml

2. Tag it: tags: [smoke] or tags: [regression]

3. Push to develop; CI picks it up automatically.
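Tags live in the flow's header, before the --- separator. A hypothetical 03_my_feature.yaml could start like this (the assertion text is a placeholder for your own UI):

```yaml
# maestro/flows/03_my_feature.yaml (header sketch)
appId: ${APP_PACKAGE}
tags:
  - regression
---
- launchApp
- assertVisible: "My Feature"
```

Locally, Maestro's --include-tags flag lets you run only a subset, e.g. maestro test --include-tags smoke maestro/flows/.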

.env.example

DASHBOARD_API_KEY=change-me-in-production

.github/workflows/e2e.yml

name: E2E Tests (Maestro)

on:
  push:
    branches: [develop]
  pull_request:
    branches: [develop, main]
  workflow_dispatch:
    inputs:
      avd_count:
        description: 'Number of parallel AVDs'
        required: false
        default: '2'

env:
  MAESTRO_VERSION: '1.38.1'
  API_LEVEL: 34
  APP_PACKAGE: com.yourapp.debug   # <-- change this

jobs:
  e2e:
    name: E2E on AVD ${{ matrix.avd_index }}
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        avd_index: [1, 2]           # add more to match avd_count
        include:
          - avd_index: 1
            test_account: ${{ secrets.TEST_ACCOUNT_1 }}
          - avd_index: 2
            test_account: ${{ secrets.TEST_ACCOUNT_2 }}

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up JDK 17
        uses: actions/setup-java@v4
        with:
          java-version: '17'
          distribution: 'temurin'

      - name: Install Maestro
        run: |
          curl -Ls "https://get.maestro.mobile.dev" | bash
          echo "$HOME/.maestro/bin" >> $GITHUB_PATH

      - name: Enable KVM (AVD acceleration)
        run: |
          echo 'KERNEL=="kvm", GROUP="kvm", MODE="0666", OPTIONS+="static_node=kvm"' | sudo tee /etc/udev/rules.d/99-kvm4all.rules
          sudo udevadm control --reload-rules
          sudo udevadm trigger --name-match=kvm

      - name: Cache AVD snapshot
        uses: actions/cache@v4
        id: avd-cache
        with:
          path: |
            ~/.android/avd/*
            ~/.android/adb*
          key: avd-api${{ env.API_LEVEL }}-${{ runner.os }}

      - name: Create AVD
        if: steps.avd-cache.outputs.cache-hit != 'true'
        uses: reactivecircus/android-emulator-runner@v2
        with:
          api-level: ${{ env.API_LEVEL }}
          target: google_apis
          arch: x86_64
          profile: pixel_6
          force-avd-creation: false
          emulator-options: -no-window -gpu swiftshader_indirect -noaudio -no-boot-anim
          disable-animations: true
          script: echo "AVD created and cached"

      - name: Download APK artifact
        uses: actions/download-artifact@v4
        with:
          name: debug-apk
          path: ./apk
        continue-on-error: true   # skip if no APK artifact in this run

      - name: Run Maestro E2E Tests
        uses: reactivecircus/android-emulator-runner@v2
        env:
          TEST_ACCOUNT: ${{ matrix.test_account }}
          TEST_PASSWORD: ${{ secrets.TEST_PASSWORD }}
          MAESTRO_DRIVER_STARTUP_TIMEOUT: 60000
        with:
          api-level: ${{ env.API_LEVEL }}
          target: google_apis
          arch: x86_64
          profile: pixel_6
          emulator-options: -no-window -gpu swiftshader_indirect -noaudio -no-boot-anim
          disable-animations: true
          script: |
            # Install APK if present
            if ls ./apk/*.apk 1> /dev/null 2>&1; then
              adb install -r ./apk/*.apk
            fi

            # Run all flows sequentially on this job's AVD; jobs themselves run in parallel
            mkdir -p ./results
            maestro test \
              --env TEST_ACCOUNT="$TEST_ACCOUNT" \
              --env TEST_PASSWORD="$TEST_PASSWORD" \
              --env APP_PACKAGE="$APP_PACKAGE" \
              --env AVD_INDEX="${{ matrix.avd_index }}" \
              --format junit \
              --output ./results/report-avd${{ matrix.avd_index }}.xml \
              maestro/flows/

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: maestro-results-avd${{ matrix.avd_index }}
          path: ./results/

      - name: Post results to Dashboard
        if: always()
        run: |
          STATUS="passed"
          if [ "${{ job.status }}" != "success" ]; then STATUS="failed"; fi

          curl -s -X POST "${{ secrets.DASHBOARD_URL }}/api/runs" \
            -H "Content-Type: application/json" \
            -H "X-API-Key: ${{ secrets.DASHBOARD_API_KEY }}" \
            -d '{
              "avd_index": ${{ matrix.avd_index }},
              "status": "'"$STATUS"'",
              "branch": "${{ github.ref_name }}",
              "commit": "${{ github.sha }}",
              "run_id": "${{ github.run_id }}"
            }'

server.js

const express = require('express');
const Database = require('better-sqlite3');
const cors = require('cors');
const path = require('path');

const app = express();
const PORT = process.env.PORT || 3000;
const API_KEY = process.env.DASHBOARD_API_KEY || 'dev-key';

// --- DB Setup ---
const db = new Database(process.env.DB_PATH || './data/runs.db');

db.exec(`
  CREATE TABLE IF NOT EXISTS runs (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    avd_index   INTEGER NOT NULL,
    status      TEXT NOT NULL CHECK(status IN ('passed','failed','running')),
    branch      TEXT NOT NULL,
    "commit"    TEXT,  -- quoted: COMMIT is a reserved word in SQLite
    run_id      TEXT,
    created_at  TEXT DEFAULT (datetime('now'))
  );

  CREATE TABLE IF NOT EXISTS screenshots (
    id      INTEGER PRIMARY KEY AUTOINCREMENT,
    run_id  INTEGER REFERENCES runs(id),
    name    TEXT,
    path    TEXT,
    taken_at TEXT DEFAULT (datetime('now'))
  );
`);

// --- Middleware ---
app.use(cors());
app.use(express.json());
app.use(express.static(path.join(__dirname, 'public')));

function requireApiKey(req, res, next) {
  const key = req.headers['x-api-key'];
  if (key !== API_KEY) return res.status(401).json({ error: 'Unauthorized' });
  next();
}

// --- API Routes ---

// POST /api/runs — called by CI at end of each AVD job
app.post('/api/runs', requireApiKey, (req, res) => {
  const { avd_index, status, branch, commit, run_id } = req.body;

  if (!avd_index || !status || !branch) {
    return res.status(400).json({ error: 'avd_index, status, branch required' });
  }

  const stmt = db.prepare(`
    INSERT INTO runs (avd_index, status, branch, "commit", run_id)
    VALUES (?, ?, ?, ?, ?)
  `);
  const result = stmt.run(avd_index, status, branch, commit || null, run_id || null);

  res.status(201).json({ id: result.lastInsertRowid });
});

// GET /api/runs — all runs with optional filters
app.get('/api/runs', (req, res) => {
  const { branch, status, limit = 50 } = req.query;

  let query = 'SELECT * FROM runs WHERE 1=1';
  const params = [];

  if (branch) { query += ' AND branch = ?'; params.push(branch); }
  if (status) { query += ' AND status = ?'; params.push(status); }

  query += ' ORDER BY created_at DESC LIMIT ?';
  params.push(Number(limit));

  const rows = db.prepare(query).all(...params);
  res.json(rows);
});

// GET /api/stats — aggregate stats for dashboard cards
app.get('/api/stats', (req, res) => {
  const total    = db.prepare(`SELECT COUNT(*) as c FROM runs`).get().c;
  const passed   = db.prepare(`SELECT COUNT(*) as c FROM runs WHERE status='passed'`).get().c;
  const failed   = db.prepare(`SELECT COUNT(*) as c FROM runs WHERE status='failed'`).get().c;
  const passRate = total > 0 ? Math.round((passed / total) * 100) : 0;

  const recentFails = db.prepare(`
    SELECT branch, COUNT(*) as c
    FROM runs
    WHERE status='failed' AND created_at >= datetime('now', '-7 days')
    GROUP BY branch
    ORDER BY c DESC
    LIMIT 5
  `).all();

  const trend = db.prepare(`
    SELECT date(created_at) as day,
           SUM(CASE WHEN status='passed' THEN 1 ELSE 0 END) as passed,
           SUM(CASE WHEN status='failed' THEN 1 ELSE 0 END) as failed
    FROM runs
    WHERE created_at >= datetime('now', '-14 days')
    GROUP BY day
    ORDER BY day ASC
  `).all();

  res.json({ total, passed, failed, passRate, recentFails, trend });
});

// GET /api/runs/:id
app.get('/api/runs/:id', (req, res) => {
  const run = db.prepare('SELECT * FROM runs WHERE id = ?').get(req.params.id);
  if (!run) return res.status(404).json({ error: 'Not found' });
  res.json(run);
});

// DELETE /api/runs/:id — for cleanup
app.delete('/api/runs/:id', requireApiKey, (req, res) => {
  db.prepare('DELETE FROM runs WHERE id = ?').run(req.params.id);
  res.json({ ok: true });
});

// Serve dashboard SPA for all other routes
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

app.listen(PORT, () => {
  console.log(`Dashboard running on http://localhost:${PORT}`);
});
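One detail worth naming in the GET /api/runs handler is the WHERE 1=1 idiom: starting from a clause that is always true lets every optional filter be appended as a uniform AND fragment. Isolated from the database, the pattern is just string and parameter accumulation:

```javascript
// Sketch of the filter-building pattern from GET /api/runs,
// separated from the database so it can be read on its own.
function buildRunsQuery({ branch, status, limit = 50 } = {}) {
  // Base clause is always true, so each filter appends the same way.
  let sql = 'SELECT * FROM runs WHERE 1=1';
  const params = [];

  if (branch) { sql += ' AND branch = ?'; params.push(branch); }
  if (status) { sql += ' AND status = ?'; params.push(status); }

  sql += ' ORDER BY created_at DESC LIMIT ?';
  params.push(Number(limit));

  return { sql, params };
}
```

buildRunsQuery({ status: 'failed' }) yields a query ending in "AND status = ? ORDER BY created_at DESC LIMIT ?" with params ['failed', 50], which is exactly what the handler hands to db.prepare(query).all(...params).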

docker-compose.yml

version: '3.9'

services:

  # ── Dashboard (Trigger + Analytics) ──────────────────────────────────────
  dashboard:
    build:
      context: ./dashboard
      dockerfile: Dockerfile
    container_name: maestro-dashboard
    restart: unless-stopped
    ports:
      - "3000:3000"
    environment:
      - PORT=3000
      - DASHBOARD_API_KEY=${DASHBOARD_API_KEY:-change-me-in-production}
      - DB_PATH=/app/data/runs.db
    volumes:
      - dashboard-data:/app/data     # persist SQLite across restarts
    healthcheck:
      test: ["CMD", "wget", "-qO-", "http://localhost:3000/api/stats"]
      interval: 30s
      timeout: 10s
      retries: 3
    networks:
      - maestro-net

  # ── Nginx reverse proxy (optional — expose dashboard on port 80) ──────────
  nginx:
    image: nginx:alpine
    container_name: maestro-nginx
    restart: unless-stopped
    ports:
      - "80:80"
    volumes:
      - ./docker/nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - dashboard
    networks:
      - maestro-net

volumes:
  dashboard-data:

networks:
  maestro-net:
    driver: bridge

Dockerfile

FROM node:20-alpine

WORKDIR /app

# Install dependencies first (layer cache)
COPY package.json ./
RUN npm install --omit=dev

# Copy source
COPY server.js ./
COPY public/ ./public/

# Data directory for SQLite (mount a volume here in prod)
RUN mkdir -p /app/data

ENV PORT=3000
ENV NODE_ENV=production

EXPOSE 3000

CMD ["node", "server.js"]
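Not shown in the tree, but worth adding next to the Dockerfile: a .dockerignore so local node_modules and the SQLite data directory never leak into the build context (entries below are suggestions):

```
node_modules
data
.env
npm-debug.log
```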