chore: trigger storybook v4
Deploy Storybook / Build and Deploy Storybook (push) Has been cancelled
parent 6928f02a9e
commit 8a1c1c0c8f
@@ -730,7 +730,28 @@
      "Bash(do echo \"=== $pod ===\")",
      "Bash(do echo \"=== Request $i ===\")",
      "Bash(while read -r line)",
      "Bash(do echo $#ine)"
      "Bash(do echo $#ine)",
      "Bash(do curl -s -o /dev/null -w \"TTFB: %{time_starttransfer}s | Total: %{time_total}s\\\\n\" https://abaci.one/practice)",
      "Bash(do curl -s -o /dev/null https://abaci.one/practice)",
      "Bash(do curl -s -o /dev/null -w \"Practice TTFB: %{time_starttransfer}s | Total: %{time_total}s\\\\n\" https://abaci.one/practice)",
      "Bash(do curl -s -o /dev/null -w \"%{time_starttransfer}\\\\n\" https://abaci.one/practice)",
      "Bash(do curl -s -o /dev/null -w \"%{time_starttransfer}\\\\n\" \"https://abaci.one/\")",
      "Bash(do curl -s -o /dev/null -w \"%{time_starttransfer}\\\\n\" \"https://abaci.one/practice\")",
      "Bash(do curl -s -o /dev/null -w \"%{time_starttransfer}\\\\n\" \"https://abaci.one/api/health\")",
      "Bash(do curl -s -o /dev/null -w \"%{time_starttransfer}\\\\n\" \"https://abaci.one/games\")",
      "Bash(kill:*)",
      "WebFetch(domain:docs.drone.io)",
      "Bash(terraform import:*)",
      "Bash(terraform refresh:*)",
      "Bash(terraform state list:*)",
      "Bash(while read resource)",
      "Bash(do terraform state rm $resource)",
      "Bash(terraform state rm:*)",
      "Bash(GITEA_PASS='AbC1G1t3a!')",
      "Bash(GITEA_TOKEN=\"4dc18f18464220bebd1031c7a0e3a63f66176449\")",
      "Bash(git remote set-url:*)",
      "Bash(GITEA_TOKEN=\"4dc18f18464220bebd1031c7a0e3a63f66176449\" curl -s \"https://git.dev.abaci.one/api/v1/repos/antialias/soroban-abacus-flashcards/actions/runs\" -H \"Authorization: token $GITEA_TOKEN\")",
      "WebFetch(domain:blog.differentpla.net)"
    ],
    "deny": [],
    "ask": []
@@ -138,3 +138,4 @@ jobs:

# trigger

# v4
@@ -240,12 +240,128 @@ ssh nas.home.network "cd /volume1/homes/antialias/projects/abaci.one && docker-c

## Network Configuration

- **Reverse Proxy**: Traefik
- **Reverse Proxy**: Traefik (see architecture below)
- **HTTPS**: Automatic via Traefik with Let's Encrypt
- **Domain**: abaci.one
- **Exposed Port**: 3000 (internal to Docker network)
- **Load Balancing**: Traefik routes to both containers; health checks determine eligibility

## Traefik Ingress Architecture

Traffic flows through two Traefik instances:

```
Internet → Traefik (Docker Compose on NAS) → Traefik (k3s) → Services
```

### Traefik on Docker Compose (Primary Ingress)

**Location**: NAS Docker Compose
**Role**: Entry point for all incoming traffic, TLS termination, subdomain routing

The Docker Compose Traefik handles:
- TLS certificates via Let's Encrypt (ACME)
- Routing subdomains to the appropriate backends
- HSTS headers and HTTP→HTTPS redirects
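
The static side of this setup (entrypoints plus the ACME resolver) isn't shown in this doc; a minimal sketch of what a matching `traefik.yml` could look like, assuming the `websecure` entrypoint and `myresolver` names used by the dynamic config in services.yaml (the email and paths are placeholders):

```yaml
# Hypothetical static config sketch (traefik.yml); adjust email/paths before use.
entryPoints:
  web:
    address: ":80"
  websecure:
    address: ":443"

certificatesResolvers:
  myresolver:
    acme:
      email: "admin@example.com"        # placeholder
      storage: "/letsencrypt/acme.json" # placeholder path
      httpChallenge:
        entryPoint: web

providers:
  file:
    filename: "/etc/traefik/services.yaml"
    watch: true
```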
**Configuration files**:
- `/volume1/homes/antialias/projects/traefik/services.yaml` - Dynamic file configuration for k8s routing

**Key configuration (services.yaml)**:
```yaml
http:
  routers:
    # Route subdomain to k8s cluster
    status-k3s:
      rule: "Host(`status.abaci.one`)"
      service: abaci-k3s
      entryPoints: ["websecure"]
      tls:
        certresolver: "myresolver"
      middlewares: ["hsts"]

    dev-k3s:
      rule: "Host(`dev.abaci.one`)"
      service: abaci-k3s
      entryPoints: ["websecure"]
      tls:
        certresolver: "myresolver"
      middlewares: ["hsts"]

  services:
    abaci-k3s:
      loadBalancer:
        servers:
          - url: "https://192.168.86.37" # k8s node IP
        passHostHeader: true # Forward original Host header
        serversTransport: "insecureTransport"

  serversTransports:
    insecureTransport:
      insecureSkipVerify: true # Trust k8s internal certs
```

### Traefik on k8s (Internal Routing)

**Role**: Routes traffic within the k8s cluster based on the Host header

The k8s Traefik receives traffic from the Docker Compose Traefik with the original Host header preserved (`passHostHeader: true`), then routes to the appropriate k8s Service based on Ingress rules.
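
The effect of the two routers is a plain Host-to-service lookup; a tiny illustrative shell function (not part of the deployment) mirroring the mapping the file provider applies before traffic reaches the k8s Traefik:

```shell
# Illustrative only: mirrors the Host -> router -> service mapping in services.yaml.
route_for() {
  case "$1" in
    status.abaci.one) echo "router=status-k3s service=abaci-k3s" ;;
    dev.abaci.one)    echo "router=dev-k3s service=abaci-k3s" ;;
    *)                echo "no file-provider router for $1" ;;
  esac
}

route_for status.abaci.one
route_for dev.abaci.one
```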
### Adding a New Subdomain

To add a new subdomain (e.g., `foo.abaci.one`):

1. **Add DNS record** (CNAME to abaci.one or A record to same IP)
   - Use Porkbun API (see `.claude/skills/porkbun-dns`)

2. **Add route to Docker Compose Traefik**:
   ```bash
   ssh nas.home.network
   vi /volume1/homes/antialias/projects/traefik/services.yaml
   ```
   Add router entries for both HTTPS and HTTP redirect (copy existing pattern).

3. **Create k8s Ingress** (in Terraform):
   ```hcl
   resource "kubernetes_ingress_v1" "foo" {
     metadata {
       name      = "foo"
       namespace = kubernetes_namespace.abaci.metadata[0].name
       annotations = {
         "traefik.ingress.kubernetes.io/router.entrypoints" = "websecure"
       }
     }
     spec {
       ingress_class_name = "traefik"
       rule {
         host = "foo.${var.app_domain}"
         http {
           path {
             path      = "/"
             path_type = "Prefix"
             backend {
               service {
                 name = kubernetes_service.foo.metadata[0].name
                 port { number = 80 }
               }
             }
           }
         }
       }
     }
   }
   ```

4. **TLS**: Docker Compose Traefik handles TLS; k8s doesn't need cert-manager for these routes
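
To cut down on copy/paste errors in step 2, the router block can be generated; a hypothetical helper sketch (the `abaci-k3s` service and the `myresolver`/`hsts` names come from the configuration above; `gen_router` itself is not part of the repo):

```shell
# Sketch: print a services.yaml router entry for a new subdomain.
# Usage: gen_router foo   -> prints a block for foo.abaci.one
gen_router() {
  sub="$1"
  cat <<EOF
    ${sub}-k3s:
      rule: "Host(\`${sub}.abaci.one\`)"
      service: abaci-k3s
      entryPoints: ["websecure"]
      tls:
        certresolver: "myresolver"
      middlewares: ["hsts"]
EOF
}

gen_router foo
```

Paste the output under `http.routers` in services.yaml; the HTTP→HTTPS redirect router still has to be added by hand from the existing pattern.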
### Current Subdomains

| Subdomain | Backend | Purpose |
|-----------|---------|---------|
| abaci.one | k8s abaci-app | Main application |
| status.abaci.one | k8s Gatus | Status page |
| dev.abaci.one | k8s nginx | Build artifacts (smoke reports, storybook) |

## Security Notes

- Production database contains user data and should be handled carefully
@@ -128,7 +128,41 @@
      "Bash(helm get values:*)",
      "Bash(kubectl set:*)",
      "Bash(kubectl annotate:*)",
      "Bash(kubectl run:*)"
      "Bash(kubectl run:*)",
      "Bash(terraform fmt:*)",
      "Bash(terraform validate:*)",
      "Bash(BASE_URL=http://localhost:3000 npx playwright test:*)",
      "Bash(pnpm list:*)",
      "Bash(pnpm exec playwright test:*)",
      "Bash(BASE_URL=http://localhost:3000 pnpm exec playwright test:*)",
      "Bash(pnpm exec playwright install:*)",
      "Bash(pnpm build:*)",
      "Bash(pnpm exec tsc:*)",
      "Bash(pnpm exec eslint:*)",
      "Bash(terraform plan:*)",
      "Bash(terraform apply:*)",
      "Bash(gh run watch:*)",
      "Bash(gh api:*)",
      "Bash(kubectl exec:*)",
      "Bash(jq:*)",
      "Bash(BASE_URL=https://abaci.one npx playwright test:*)",
      "Bash(docker build:*)",
      "Bash(docker run:*)",
      "Bash(docker ps:*)",
      "Bash(kubectl wait:*)",
      "Skill(porkbun-dns)",
      "Bash(kubectl delete:*)",
      "Bash(terraform state rm:*)",
      "Bash(kubectl patch:*)",
      "Bash(kubectl create:*)",
      "Bash(kubectl top:*)",
      "Bash(terraform init:*)",
      "Bash(kubectl:*)",
      "Bash(docker manifest inspect:*)",
      "Bash(crane digest:*)",
      "Bash(for pod in abaci-app-0 abaci-app-1 abaci-app-2)",
      "Bash(do echo '=== $pod ===')",
      "Bash(done)"
    ],
    "deny": [],
    "ask": []
@@ -1,9 +1,7 @@
/**
 * Practice page smoke test
 *
 * Verifies that practice section is accessible via navigation.
 * Note: Direct navigation to /create pages can timeout due to heavy client-side
 * rendering, so we test via navigation from homepage instead.
 * Verifies that the practice page loads and displays player list.
 */

import { expect, test } from "@playwright/test";
@@ -11,16 +9,16 @@ import { expect, test } from "@playwright/test";
test.describe("Practice Smoke Tests", () => {
  test.setTimeout(30000);

  test("can navigate to create page", async ({ page }) => {
    await page.goto("/");
  test("practice page loads", async ({ page }) => {
    await page.goto("/practice");
    await page.waitForLoadState("networkidle");

    // Find and click create link
    const createLink = page.locator('a[href="/create"]').first();
    await expect(createLink).toBeVisible({ timeout: 5000 });
    await createLink.click();
    // Should be on practice page
    await expect(page).toHaveURL(/\/practice/);

    await page.waitForLoadState("networkidle");
    await expect(page).toHaveURL(/\/create/);
    // Page should have interactive elements (indicates JS hydrated)
    await expect(page.locator("a, button").first()).toBeVisible({
      timeout: 15000,
    });
  });
});
@@ -0,0 +1,27 @@
/**
 * Settings page smoke test
 *
 * Verifies that the settings page loads and displays tabs.
 */

import { expect, test } from "@playwright/test";

test.describe("Settings Smoke Tests", () => {
  test.setTimeout(30000);

  test("settings page loads with tabs", async ({ page }) => {
    await page.goto("/settings");
    await page.waitForLoadState("networkidle");

    // Should be on settings page
    await expect(page).toHaveURL(/\/settings/);

    // Settings header should be visible
    await expect(page.locator('[data-component="settings-page"]')).toBeVisible({
      timeout: 15000,
    });

    // Tab navigation should be visible with General tab
    await expect(page.locator('[data-tab="general"]')).toBeVisible();
  });
});
@@ -1,4 +1,4 @@
{
  "status": "passed",
  "status": "failed",
  "failedTests": []
}
@@ -60,3 +60,23 @@ provider "registry.terraform.io/hashicorp/null" {
    "zh:ed0fe2acdb61330b01841fa790be00ec6beaac91d41f311fb8254f74eb6a711f",
  ]
}

provider "registry.terraform.io/hashicorp/random" {
  version     = "3.8.0"
  constraints = "~> 3.6"
  hashes = [
    "h1:BYpqK2+ZHqNF9sauVugKJSeFWMCx11I/z/1lMplwUC0=",
    "zh:0e71891d8f25564e8d0b61654ed2ca52101862b9a2a07d736395193ae07b134b",
    "zh:1c56852d094161997df5fd8a6cbca7c6c979b3f8c3c00fbcc374a59305d117b1",
    "zh:20698fb8a2eaa7e23c4f8e3d22250368862f578cf618be0281d5b61496cbef13",
    "zh:3afbdd5e955f6d0105fed4f6b7fef7ba165cd780569483e688002108cf06586c",
    "zh:4ce22b96e625dc203ea653d53551d46156dd63ad79e45bcbe0224b2e6357f243",
    "zh:4ff84b568ad468d140f8f6201a372c6c4bea17d64527b72e341ae8fafea65b8e",
    "zh:54b071cb509203c43e420cc589523709bbc6e65d80c1cd9384f5bd88fd1ff1a2",
    "zh:63fc5f9f341a573cd5c8bcfc994a58fa52a5ad88d2cbbd80f5a9f143c5006e75",
    "zh:73cb8b39887589914686d14a99b4de6e85e48603f7235d87da5594e3fbb7d8a7",
    "zh:78d5eefdd9e494defcb3c68d282b8f96630502cac21d1ea161f53cfe9bb483b3",
    "zh:7ee20f28aa6a25539a5b9fc249e751dec5a5b130dcd73c5d05efdf4d5e320454",
    "zh:994a83fddab1d44a8f546920ed34e45ea6caefe4f08735bada6c28dc9010e5e4",
  ]
}
@@ -61,6 +61,9 @@ resource "kubernetes_config_map" "app_config" {
    DATABASE_URL = "/litefs/sqlite.db"
    # Trust the proxy for Auth.js
    AUTH_TRUST_HOST = "true"
    # OpenTelemetry tracing configuration
    OTEL_EXPORTER_OTLP_ENDPOINT = "http://tempo.monitoring.svc.cluster.local:4317"
    OTEL_SERVICE_NAME           = "abaci-app"
  }
}
@@ -251,7 +254,7 @@ resource "kubernetes_stateful_set" "app" {
    exec:
      - cmd: "node dist/db/migrate.js"
        if-candidate: true
      - cmd: "node server.js"
      - cmd: "node --require ./instrumentation.js server.js"
    LITEFS_CONFIG

    exec litefs mount -config /tmp/litefs.yml
@@ -36,6 +36,8 @@ resource "kubernetes_config_map" "gatus_config" {
        group: Arcade
        url: "https://abaci.one/games"
        interval: 120s
        client:
          timeout: 30s
        conditions:
          - "[STATUS] == 200"

@@ -44,6 +46,8 @@ resource "kubernetes_config_map" "gatus_config" {
        group: Worksheets
        url: "https://abaci.one/create/worksheets"
        interval: 120s
        client:
          timeout: 30s
        conditions:
          - "[STATUS] == 200"

@@ -51,6 +55,8 @@ resource "kubernetes_config_map" "gatus_config" {
        group: Worksheets
        url: "https://abaci.one/create/flashcards"
        interval: 120s
        client:
          timeout: 30s
        conditions:
          - "[STATUS] == 200"

@@ -59,6 +65,8 @@ resource "kubernetes_config_map" "gatus_config" {
        group: Flowcharts
        url: "https://abaci.one/flowchart"
        interval: 120s
        client:
          timeout: 30s
        conditions:
          - "[STATUS] == 200"
@@ -629,6 +629,15 @@ resource "kubernetes_deployment" "gitea_runner" {
      }

      spec {
        # Use Default DNS policy to use node's DNS (bypasses broken coredns)
        dns_policy = "Default"

        # Also add hostAliases for internal services since we're not using cluster DNS
        host_aliases {
          ip        = "10.43.85.76" # gitea service IP
          hostnames = ["gitea.gitea.svc.cluster.local"]
        }

        # Docker-in-Docker sidecar for running container-based actions
        container {
          name = "dind"
@@ -125,6 +125,103 @@ resource "helm_release" "kube_prometheus_stack" {
  depends_on = [kubernetes_namespace.monitoring]
}

# =============================================================================
# Grafana Tempo - Distributed Tracing
# =============================================================================
# Receives traces via OTLP from instrumented applications.
# Integrates with Grafana for trace visualization.

resource "helm_release" "tempo" {
  name       = "tempo"
  repository = "https://grafana.github.io/helm-charts"
  chart      = "tempo"
  version    = "1.7.2"
  namespace  = kubernetes_namespace.monitoring.metadata[0].name

  timeout = 300

  values = [yamlencode({
    tempo = {
      # Retention period for traces
      retention = "168h" # 7 days
      # Resource limits for small cluster
      resources = {
        requests = {
          memory = "256Mi"
          cpu    = "100m"
        }
        limits = {
          memory = "512Mi"
          cpu    = "500m"
        }
      }
    }
    # Enable trace ingestion via OTLP
    traces = {
      otlp = {
        grpc = {
          enabled = true
        }
        http = {
          enabled = true
        }
      }
    }
    # Persistence for trace data
    persistence = {
      enabled          = true
      size             = "5Gi"
      storageClassName = "local-path"
    }
  })]

  depends_on = [kubernetes_namespace.monitoring]
}

# Add Tempo as a datasource in Grafana
# The kube-prometheus-stack Grafana sidecar will pick this up
resource "kubernetes_config_map" "grafana_datasource_tempo" {
  metadata {
    name      = "grafana-datasource-tempo"
    namespace = kubernetes_namespace.monitoring.metadata[0].name
    labels = {
      grafana_datasource = "1"
    }
  }

  data = {
    "tempo-datasource.yaml" = yamlencode({
      apiVersion = 1
      datasources = [{
        name      = "Tempo"
        type      = "tempo"
        access    = "proxy"
        url       = "http://tempo:3100"
        isDefault = false
        jsonData = {
          tracesToLogsV2 = {
            datasourceUid = "prometheus"
          }
          tracesToMetrics = {
            datasourceUid = "prometheus"
          }
          serviceMap = {
            datasourceUid = "prometheus"
          }
          nodeGraph = {
            enabled = true
          }
          lokiSearch = {
            datasourceUid = ""
          }
        }
      }]
    })
  }

  depends_on = [helm_release.tempo, helm_release.kube_prometheus_stack]
}

# Grafana Ingress with TLS (grafana.dev.abaci.one)
resource "kubernetes_ingress_v1" "grafana" {
  metadata {
@@ -10,9 +10,9 @@ case `uname` in
esac

if [ -z "$NODE_PATH" ]; then
  export NODE_PATH="/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_@vitest+ui@3.2.4_happy-dom@18.0.1_jsdom@27.0.0_postcss@8.5.6__terser@5.44.0/node_modules/vitest/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_@vitest+ui@3.2.4_happy-dom@18.0.1_jsdom@27.0.0_postcss@8.5.6__terser@5.44.0/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/node_modules"
  export NODE_PATH="/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_happy-dom@18.0.1_jsdom@27.0.0_canvas-mock@0.0.0_postc_689121289fd571ce3011c17e1e57c748/node_modules/vitest/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_happy-dom@18.0.1_jsdom@27.0.0_canvas-mock@0.0.0_postc_689121289fd571ce3011c17e1e57c748/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/node_modules"
else
  export NODE_PATH="/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_@vitest+ui@3.2.4_happy-dom@18.0.1_jsdom@27.0.0_postcss@8.5.6__terser@5.44.0/node_modules/vitest/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_@vitest+ui@3.2.4_happy-dom@18.0.1_jsdom@27.0.0_postcss@8.5.6__terser@5.44.0/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/node_modules:$NODE_PATH"
  export NODE_PATH="/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_happy-dom@18.0.1_jsdom@27.0.0_canvas-mock@0.0.0_postc_689121289fd571ce3011c17e1e57c748/node_modules/vitest/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_happy-dom@18.0.1_jsdom@27.0.0_canvas-mock@0.0.0_postc_689121289fd571ce3011c17e1e57c748/node_modules:/Users/antialias/projects/soroban-abacus-flashcards/node_modules/.pnpm/node_modules:$NODE_PATH"
fi
if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../vitest/vitest.mjs" "$@"
@@ -1 +1 @@
../../../../../node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_@vitest+ui@3.2.4_happy-dom@18.0.1_jsdom@27.0.0_postcss@8.5.6__terser@5.44.0/node_modules/vitest
../../../../../node_modules/.pnpm/vitest@1.6.1_@types+node@20.19.19_happy-dom@18.0.1_jsdom@27.0.0_canvas-mock@0.0.0_postc_689121289fd571ce3011c17e1e57c748/node_modules/vitest