health checks (#39)

resolves #35

Co-authored-by: openhands <openhands@all-hands.dev>
Co-authored-by: johba <johba@harb.eth>
Reviewed-on: https://codeberg.org/johba/harb/pulls/39
johba 2025-10-02 14:37:59 +02:00
parent a29ca1a26a
commit 3ab2d9454a
14 changed files with 586 additions and 611 deletions


@@ -11,7 +11,7 @@
3. **Compete** - Snatch undervalued positions to optimise returns.
## Operating the Stack
- Start everything with `nohup ./scripts/local_env.sh start &` and stop via `./scripts/local_env.sh stop`. Do not launch services individually.
- Start everything with `nohup ./scripts/dev.sh start &` and stop via `./scripts/dev.sh stop`. Do not launch services individually.
- Supported environments: `BASE_SEPOLIA_LOCAL_FORK` (default Anvil fork), `BASE_SEPOLIA`, and `BASE`. Match contract addresses and RPCs accordingly.
- The stack boots Anvil, deploys contracts, seeds liquidity, starts Ponder, launches the landing site, and runs the txnBot. Wait for logs to settle before manual testing.
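"Wait for logs to settle" can be automated by polling one of the stack's endpoints instead of watching output. A minimal sketch, assuming Ponder's GraphQL server listens on its usual port 42069 (as elsewhere in this repo):

```shell
#!/usr/bin/env bash
# Poll a URL until it answers, so manual testing starts only after the stack
# settles. Retry count and interval are tunable; port 42069 (Ponder) is an
# assumption based on the services the stack boots.
wait_ready() {
  local url=$1 tries=${2:-30} interval=${3:-2}
  local i
  for ((i = 0; i < tries; i++)); do
    curl -fsS "$url" >/dev/null 2>&1 && return 0
    sleep "$interval"
  done
  return 1
}
# After `nohup ./scripts/dev.sh start &`:
#   wait_ready http://127.0.0.1:42069/ && echo "stack ready"
```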


@@ -1,8 +1,11 @@
FROM node:20-bookworm
FROM node:20-alpine
RUN apt-get update \
&& apt-get install -y --no-install-recommends dumb-init \
&& rm -rf /var/lib/apt/lists/*
RUN apk add --no-cache \
dumb-init \
git \
bash \
postgresql-client \
wget
USER node
WORKDIR /workspace


@@ -1,6 +1,6 @@
# Podman Staging Environment
The Podman stack mirrors `scripts/local_env.sh` using long-lived containers. Every boot spins up a fresh Base Sepolia fork, redeploys contracts, seeds liquidity, and launches the live-reload services behind Caddy on port 80.
The Podman stack mirrors `scripts/dev.sh` using long-lived containers. Every boot spins up a fresh Base Sepolia fork, redeploys contracts, seeds liquidity, and launches the live-reload services behind Caddy on port 80.
## Service Topology
- `anvil` Base Sepolia fork with optional mnemonic from `onchain/.secret.local`


@@ -1,6 +1,6 @@
overwrite: true
schema: "./schema-with-scalars.graphql" # Or an endpoint if your schema is remote
documents: "../subgraph/base_sepolia/src/**/*.graphql"
schema: "../services/ponder/generated/schema.graphql"
documents: []
generates:
src/__generated__/graphql.ts:
config:
@@ -11,4 +11,4 @@ generates:
Int8: number
plugins:
- "typescript"
- "typescript-operations"
- "typescript-operations"
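With the schema now pointing at Ponder's generated file, the typed client is regenerated from the local checkout rather than a remote endpoint. A hedged one-liner, assuming `@graphql-codegen/cli` is installed as a dev dependency and the config keeps a default filename:

```shell
# Regenerate src/__generated__/graphql.ts after Ponder's schema changes.
# Config filename is an assumption; adjust to this package's actual path.
npx graphql-codegen --config codegen.yml
```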


@@ -12,41 +12,442 @@ export type Scalars = {
Boolean: { input: boolean; output: boolean; }
Int: { input: number; output: number; }
Float: { input: number; output: number; }
BigDecimal: { input: any; output: any; }
BigInt: { input: any; output: any; }
Bytes: { input: any; output: any; }
BigInt: { input: string; output: string; }
/** The `JSON` scalar type represents JSON values as specified by [ECMA-404](http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf). */
JSON: { input: any; output: any; }
};
export type Position = {
__typename?: 'Position';
creationTime: Scalars['Int']['output'];
id: Scalars['Bytes']['output'];
lastTaxTime?: Maybe<Scalars['Int']['output']>;
owner: Scalars['Bytes']['output'];
share: Scalars['BigDecimal']['output'];
status: PositionStatus;
taxRate: Scalars['BigDecimal']['output'];
export type Meta = {
__typename?: 'Meta';
status?: Maybe<Scalars['JSON']['output']>;
};
export enum PositionStatus {
Active = 'Active',
Closed = 'Closed'
}
export type PageInfo = {
__typename?: 'PageInfo';
endCursor?: Maybe<Scalars['String']['output']>;
hasNextPage: Scalars['Boolean']['output'];
hasPreviousPage: Scalars['Boolean']['output'];
startCursor?: Maybe<Scalars['String']['output']>;
};
export type Query = {
__typename?: 'Query';
positions?: Maybe<Array<Maybe<Position>>>;
stats?: Maybe<Array<Maybe<Stats>>>;
_meta?: Maybe<Meta>;
positions?: Maybe<Positions>;
positionss: PositionsPage;
stats?: Maybe<Stats>;
statss: StatsPage;
};
export type QueryPositionsArgs = {
id: Scalars['String']['input'];
};
export type QueryPositionssArgs = {
after?: InputMaybe<Scalars['String']['input']>;
before?: InputMaybe<Scalars['String']['input']>;
limit?: InputMaybe<Scalars['Int']['input']>;
orderBy?: InputMaybe<Scalars['String']['input']>;
orderDirection?: InputMaybe<Scalars['String']['input']>;
where?: InputMaybe<PositionsFilter>;
};
export type QueryStatsArgs = {
id: Scalars['String']['input'];
};
export type QueryStatssArgs = {
after?: InputMaybe<Scalars['String']['input']>;
before?: InputMaybe<Scalars['String']['input']>;
limit?: InputMaybe<Scalars['Int']['input']>;
orderBy?: InputMaybe<Scalars['String']['input']>;
orderDirection?: InputMaybe<Scalars['String']['input']>;
where?: InputMaybe<StatsFilter>;
};
export type Positions = {
__typename?: 'positions';
closedAt?: Maybe<Scalars['BigInt']['output']>;
createdAt: Scalars['BigInt']['output'];
creationTime: Scalars['BigInt']['output'];
id: Scalars['String']['output'];
kraikenDeposit: Scalars['BigInt']['output'];
lastTaxTime: Scalars['BigInt']['output'];
owner: Scalars['String']['output'];
payout: Scalars['BigInt']['output'];
share: Scalars['Float']['output'];
snatched: Scalars['Int']['output'];
stakeDeposit: Scalars['BigInt']['output'];
status: Scalars['String']['output'];
taxPaid: Scalars['BigInt']['output'];
taxRate: Scalars['Float']['output'];
totalSupplyEnd?: Maybe<Scalars['BigInt']['output']>;
totalSupplyInit: Scalars['BigInt']['output'];
};
export type PositionsFilter = {
AND?: InputMaybe<Array<InputMaybe<PositionsFilter>>>;
OR?: InputMaybe<Array<InputMaybe<PositionsFilter>>>;
closedAt?: InputMaybe<Scalars['BigInt']['input']>;
closedAt_gt?: InputMaybe<Scalars['BigInt']['input']>;
closedAt_gte?: InputMaybe<Scalars['BigInt']['input']>;
closedAt_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
closedAt_lt?: InputMaybe<Scalars['BigInt']['input']>;
closedAt_lte?: InputMaybe<Scalars['BigInt']['input']>;
closedAt_not?: InputMaybe<Scalars['BigInt']['input']>;
closedAt_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
createdAt?: InputMaybe<Scalars['BigInt']['input']>;
createdAt_gt?: InputMaybe<Scalars['BigInt']['input']>;
createdAt_gte?: InputMaybe<Scalars['BigInt']['input']>;
createdAt_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
createdAt_lt?: InputMaybe<Scalars['BigInt']['input']>;
createdAt_lte?: InputMaybe<Scalars['BigInt']['input']>;
createdAt_not?: InputMaybe<Scalars['BigInt']['input']>;
createdAt_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
creationTime?: InputMaybe<Scalars['BigInt']['input']>;
creationTime_gt?: InputMaybe<Scalars['BigInt']['input']>;
creationTime_gte?: InputMaybe<Scalars['BigInt']['input']>;
creationTime_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
creationTime_lt?: InputMaybe<Scalars['BigInt']['input']>;
creationTime_lte?: InputMaybe<Scalars['BigInt']['input']>;
creationTime_not?: InputMaybe<Scalars['BigInt']['input']>;
creationTime_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
id?: InputMaybe<Scalars['String']['input']>;
id_contains?: InputMaybe<Scalars['String']['input']>;
id_ends_with?: InputMaybe<Scalars['String']['input']>;
id_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
id_not?: InputMaybe<Scalars['String']['input']>;
id_not_contains?: InputMaybe<Scalars['String']['input']>;
id_not_ends_with?: InputMaybe<Scalars['String']['input']>;
id_not_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
id_not_starts_with?: InputMaybe<Scalars['String']['input']>;
id_starts_with?: InputMaybe<Scalars['String']['input']>;
kraikenDeposit?: InputMaybe<Scalars['BigInt']['input']>;
kraikenDeposit_gt?: InputMaybe<Scalars['BigInt']['input']>;
kraikenDeposit_gte?: InputMaybe<Scalars['BigInt']['input']>;
kraikenDeposit_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
kraikenDeposit_lt?: InputMaybe<Scalars['BigInt']['input']>;
kraikenDeposit_lte?: InputMaybe<Scalars['BigInt']['input']>;
kraikenDeposit_not?: InputMaybe<Scalars['BigInt']['input']>;
kraikenDeposit_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
lastTaxTime?: InputMaybe<Scalars['BigInt']['input']>;
lastTaxTime_gt?: InputMaybe<Scalars['BigInt']['input']>;
lastTaxTime_gte?: InputMaybe<Scalars['BigInt']['input']>;
lastTaxTime_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
lastTaxTime_lt?: InputMaybe<Scalars['BigInt']['input']>;
lastTaxTime_lte?: InputMaybe<Scalars['BigInt']['input']>;
lastTaxTime_not?: InputMaybe<Scalars['BigInt']['input']>;
lastTaxTime_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
owner?: InputMaybe<Scalars['String']['input']>;
owner_contains?: InputMaybe<Scalars['String']['input']>;
owner_ends_with?: InputMaybe<Scalars['String']['input']>;
owner_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
owner_not?: InputMaybe<Scalars['String']['input']>;
owner_not_contains?: InputMaybe<Scalars['String']['input']>;
owner_not_ends_with?: InputMaybe<Scalars['String']['input']>;
owner_not_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
owner_not_starts_with?: InputMaybe<Scalars['String']['input']>;
owner_starts_with?: InputMaybe<Scalars['String']['input']>;
payout?: InputMaybe<Scalars['BigInt']['input']>;
payout_gt?: InputMaybe<Scalars['BigInt']['input']>;
payout_gte?: InputMaybe<Scalars['BigInt']['input']>;
payout_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
payout_lt?: InputMaybe<Scalars['BigInt']['input']>;
payout_lte?: InputMaybe<Scalars['BigInt']['input']>;
payout_not?: InputMaybe<Scalars['BigInt']['input']>;
payout_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
share?: InputMaybe<Scalars['Float']['input']>;
share_gt?: InputMaybe<Scalars['Float']['input']>;
share_gte?: InputMaybe<Scalars['Float']['input']>;
share_in?: InputMaybe<Array<InputMaybe<Scalars['Float']['input']>>>;
share_lt?: InputMaybe<Scalars['Float']['input']>;
share_lte?: InputMaybe<Scalars['Float']['input']>;
share_not?: InputMaybe<Scalars['Float']['input']>;
share_not_in?: InputMaybe<Array<InputMaybe<Scalars['Float']['input']>>>;
snatched?: InputMaybe<Scalars['Int']['input']>;
snatched_gt?: InputMaybe<Scalars['Int']['input']>;
snatched_gte?: InputMaybe<Scalars['Int']['input']>;
snatched_in?: InputMaybe<Array<InputMaybe<Scalars['Int']['input']>>>;
snatched_lt?: InputMaybe<Scalars['Int']['input']>;
snatched_lte?: InputMaybe<Scalars['Int']['input']>;
snatched_not?: InputMaybe<Scalars['Int']['input']>;
snatched_not_in?: InputMaybe<Array<InputMaybe<Scalars['Int']['input']>>>;
stakeDeposit?: InputMaybe<Scalars['BigInt']['input']>;
stakeDeposit_gt?: InputMaybe<Scalars['BigInt']['input']>;
stakeDeposit_gte?: InputMaybe<Scalars['BigInt']['input']>;
stakeDeposit_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
stakeDeposit_lt?: InputMaybe<Scalars['BigInt']['input']>;
stakeDeposit_lte?: InputMaybe<Scalars['BigInt']['input']>;
stakeDeposit_not?: InputMaybe<Scalars['BigInt']['input']>;
stakeDeposit_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
status?: InputMaybe<Scalars['String']['input']>;
status_contains?: InputMaybe<Scalars['String']['input']>;
status_ends_with?: InputMaybe<Scalars['String']['input']>;
status_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
status_not?: InputMaybe<Scalars['String']['input']>;
status_not_contains?: InputMaybe<Scalars['String']['input']>;
status_not_ends_with?: InputMaybe<Scalars['String']['input']>;
status_not_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
status_not_starts_with?: InputMaybe<Scalars['String']['input']>;
status_starts_with?: InputMaybe<Scalars['String']['input']>;
taxPaid?: InputMaybe<Scalars['BigInt']['input']>;
taxPaid_gt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaid_gte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaid_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaid_lt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaid_lte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaid_not?: InputMaybe<Scalars['BigInt']['input']>;
taxPaid_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxRate?: InputMaybe<Scalars['Float']['input']>;
taxRate_gt?: InputMaybe<Scalars['Float']['input']>;
taxRate_gte?: InputMaybe<Scalars['Float']['input']>;
taxRate_in?: InputMaybe<Array<InputMaybe<Scalars['Float']['input']>>>;
taxRate_lt?: InputMaybe<Scalars['Float']['input']>;
taxRate_lte?: InputMaybe<Scalars['Float']['input']>;
taxRate_not?: InputMaybe<Scalars['Float']['input']>;
taxRate_not_in?: InputMaybe<Array<InputMaybe<Scalars['Float']['input']>>>;
totalSupplyEnd?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyEnd_gt?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyEnd_gte?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyEnd_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalSupplyEnd_lt?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyEnd_lte?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyEnd_not?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyEnd_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalSupplyInit?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyInit_gt?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyInit_gte?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyInit_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalSupplyInit_lt?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyInit_lte?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyInit_not?: InputMaybe<Scalars['BigInt']['input']>;
totalSupplyInit_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
};
export type PositionsPage = {
__typename?: 'positionsPage';
items: Array<Positions>;
pageInfo: PageInfo;
totalCount: Scalars['Int']['output'];
};
export type Stats = {
__typename?: 'Stats';
activeSupply: Scalars['BigInt']['output'];
id: Scalars['Bytes']['output'];
outstandingSupply: Scalars['BigInt']['output'];
__typename?: 'stats';
burnNextHourProjected: Scalars['BigInt']['output'];
burnedLastDay: Scalars['BigInt']['output'];
burnedLastWeek: Scalars['BigInt']['output'];
id: Scalars['String']['output'];
kraikenTotalSupply: Scalars['BigInt']['output'];
lastHourlyUpdateTimestamp: Scalars['BigInt']['output'];
mintNextHourProjected: Scalars['BigInt']['output'];
mintedLastDay: Scalars['BigInt']['output'];
mintedLastWeek: Scalars['BigInt']['output'];
outstandingStake: Scalars['BigInt']['output'];
ringBuffer: Scalars['JSON']['output'];
ringBufferPointer: Scalars['Int']['output'];
stakeTotalSupply: Scalars['BigInt']['output'];
taxPaidLastDay: Scalars['BigInt']['output'];
taxPaidLastWeek: Scalars['BigInt']['output'];
taxPaidNextHourProjected: Scalars['BigInt']['output'];
totalBurned: Scalars['BigInt']['output'];
totalMinted: Scalars['BigInt']['output'];
totalTaxPaid: Scalars['BigInt']['output'];
totalUbiClaimed: Scalars['BigInt']['output'];
ubiClaimedLastDay: Scalars['BigInt']['output'];
ubiClaimedLastWeek: Scalars['BigInt']['output'];
ubiClaimedNextHourProjected: Scalars['BigInt']['output'];
};
export type GetPositionsQueryVariables = Exact<{ [key: string]: never; }>;
export type StatsFilter = {
AND?: InputMaybe<Array<InputMaybe<StatsFilter>>>;
OR?: InputMaybe<Array<InputMaybe<StatsFilter>>>;
burnNextHourProjected?: InputMaybe<Scalars['BigInt']['input']>;
burnNextHourProjected_gt?: InputMaybe<Scalars['BigInt']['input']>;
burnNextHourProjected_gte?: InputMaybe<Scalars['BigInt']['input']>;
burnNextHourProjected_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
burnNextHourProjected_lt?: InputMaybe<Scalars['BigInt']['input']>;
burnNextHourProjected_lte?: InputMaybe<Scalars['BigInt']['input']>;
burnNextHourProjected_not?: InputMaybe<Scalars['BigInt']['input']>;
burnNextHourProjected_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
burnedLastDay?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastDay_gt?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastDay_gte?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastDay_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
burnedLastDay_lt?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastDay_lte?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastDay_not?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastDay_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
burnedLastWeek?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastWeek_gt?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastWeek_gte?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastWeek_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
burnedLastWeek_lt?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastWeek_lte?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastWeek_not?: InputMaybe<Scalars['BigInt']['input']>;
burnedLastWeek_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
id?: InputMaybe<Scalars['String']['input']>;
id_contains?: InputMaybe<Scalars['String']['input']>;
id_ends_with?: InputMaybe<Scalars['String']['input']>;
id_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
id_not?: InputMaybe<Scalars['String']['input']>;
id_not_contains?: InputMaybe<Scalars['String']['input']>;
id_not_ends_with?: InputMaybe<Scalars['String']['input']>;
id_not_in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
id_not_starts_with?: InputMaybe<Scalars['String']['input']>;
id_starts_with?: InputMaybe<Scalars['String']['input']>;
kraikenTotalSupply?: InputMaybe<Scalars['BigInt']['input']>;
kraikenTotalSupply_gt?: InputMaybe<Scalars['BigInt']['input']>;
kraikenTotalSupply_gte?: InputMaybe<Scalars['BigInt']['input']>;
kraikenTotalSupply_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
kraikenTotalSupply_lt?: InputMaybe<Scalars['BigInt']['input']>;
kraikenTotalSupply_lte?: InputMaybe<Scalars['BigInt']['input']>;
kraikenTotalSupply_not?: InputMaybe<Scalars['BigInt']['input']>;
kraikenTotalSupply_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
lastHourlyUpdateTimestamp?: InputMaybe<Scalars['BigInt']['input']>;
lastHourlyUpdateTimestamp_gt?: InputMaybe<Scalars['BigInt']['input']>;
lastHourlyUpdateTimestamp_gte?: InputMaybe<Scalars['BigInt']['input']>;
lastHourlyUpdateTimestamp_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
lastHourlyUpdateTimestamp_lt?: InputMaybe<Scalars['BigInt']['input']>;
lastHourlyUpdateTimestamp_lte?: InputMaybe<Scalars['BigInt']['input']>;
lastHourlyUpdateTimestamp_not?: InputMaybe<Scalars['BigInt']['input']>;
lastHourlyUpdateTimestamp_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
mintNextHourProjected?: InputMaybe<Scalars['BigInt']['input']>;
mintNextHourProjected_gt?: InputMaybe<Scalars['BigInt']['input']>;
mintNextHourProjected_gte?: InputMaybe<Scalars['BigInt']['input']>;
mintNextHourProjected_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
mintNextHourProjected_lt?: InputMaybe<Scalars['BigInt']['input']>;
mintNextHourProjected_lte?: InputMaybe<Scalars['BigInt']['input']>;
mintNextHourProjected_not?: InputMaybe<Scalars['BigInt']['input']>;
mintNextHourProjected_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
mintedLastDay?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastDay_gt?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastDay_gte?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastDay_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
mintedLastDay_lt?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastDay_lte?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastDay_not?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastDay_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
mintedLastWeek?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastWeek_gt?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastWeek_gte?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastWeek_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
mintedLastWeek_lt?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastWeek_lte?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastWeek_not?: InputMaybe<Scalars['BigInt']['input']>;
mintedLastWeek_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
outstandingStake?: InputMaybe<Scalars['BigInt']['input']>;
outstandingStake_gt?: InputMaybe<Scalars['BigInt']['input']>;
outstandingStake_gte?: InputMaybe<Scalars['BigInt']['input']>;
outstandingStake_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
outstandingStake_lt?: InputMaybe<Scalars['BigInt']['input']>;
outstandingStake_lte?: InputMaybe<Scalars['BigInt']['input']>;
outstandingStake_not?: InputMaybe<Scalars['BigInt']['input']>;
outstandingStake_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ringBufferPointer?: InputMaybe<Scalars['Int']['input']>;
ringBufferPointer_gt?: InputMaybe<Scalars['Int']['input']>;
ringBufferPointer_gte?: InputMaybe<Scalars['Int']['input']>;
ringBufferPointer_in?: InputMaybe<Array<InputMaybe<Scalars['Int']['input']>>>;
ringBufferPointer_lt?: InputMaybe<Scalars['Int']['input']>;
ringBufferPointer_lte?: InputMaybe<Scalars['Int']['input']>;
ringBufferPointer_not?: InputMaybe<Scalars['Int']['input']>;
ringBufferPointer_not_in?: InputMaybe<Array<InputMaybe<Scalars['Int']['input']>>>;
stakeTotalSupply?: InputMaybe<Scalars['BigInt']['input']>;
stakeTotalSupply_gt?: InputMaybe<Scalars['BigInt']['input']>;
stakeTotalSupply_gte?: InputMaybe<Scalars['BigInt']['input']>;
stakeTotalSupply_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
stakeTotalSupply_lt?: InputMaybe<Scalars['BigInt']['input']>;
stakeTotalSupply_lte?: InputMaybe<Scalars['BigInt']['input']>;
stakeTotalSupply_not?: InputMaybe<Scalars['BigInt']['input']>;
stakeTotalSupply_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaidLastDay?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastDay_gt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastDay_gte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastDay_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaidLastDay_lt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastDay_lte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastDay_not?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastDay_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaidLastWeek?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastWeek_gt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastWeek_gte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastWeek_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaidLastWeek_lt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastWeek_lte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastWeek_not?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidLastWeek_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaidNextHourProjected?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidNextHourProjected_gt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidNextHourProjected_gte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidNextHourProjected_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
taxPaidNextHourProjected_lt?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidNextHourProjected_lte?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidNextHourProjected_not?: InputMaybe<Scalars['BigInt']['input']>;
taxPaidNextHourProjected_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalBurned?: InputMaybe<Scalars['BigInt']['input']>;
totalBurned_gt?: InputMaybe<Scalars['BigInt']['input']>;
totalBurned_gte?: InputMaybe<Scalars['BigInt']['input']>;
totalBurned_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalBurned_lt?: InputMaybe<Scalars['BigInt']['input']>;
totalBurned_lte?: InputMaybe<Scalars['BigInt']['input']>;
totalBurned_not?: InputMaybe<Scalars['BigInt']['input']>;
totalBurned_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalMinted?: InputMaybe<Scalars['BigInt']['input']>;
totalMinted_gt?: InputMaybe<Scalars['BigInt']['input']>;
totalMinted_gte?: InputMaybe<Scalars['BigInt']['input']>;
totalMinted_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalMinted_lt?: InputMaybe<Scalars['BigInt']['input']>;
totalMinted_lte?: InputMaybe<Scalars['BigInt']['input']>;
totalMinted_not?: InputMaybe<Scalars['BigInt']['input']>;
totalMinted_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalTaxPaid?: InputMaybe<Scalars['BigInt']['input']>;
totalTaxPaid_gt?: InputMaybe<Scalars['BigInt']['input']>;
totalTaxPaid_gte?: InputMaybe<Scalars['BigInt']['input']>;
totalTaxPaid_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalTaxPaid_lt?: InputMaybe<Scalars['BigInt']['input']>;
totalTaxPaid_lte?: InputMaybe<Scalars['BigInt']['input']>;
totalTaxPaid_not?: InputMaybe<Scalars['BigInt']['input']>;
totalTaxPaid_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalUbiClaimed?: InputMaybe<Scalars['BigInt']['input']>;
totalUbiClaimed_gt?: InputMaybe<Scalars['BigInt']['input']>;
totalUbiClaimed_gte?: InputMaybe<Scalars['BigInt']['input']>;
totalUbiClaimed_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
totalUbiClaimed_lt?: InputMaybe<Scalars['BigInt']['input']>;
totalUbiClaimed_lte?: InputMaybe<Scalars['BigInt']['input']>;
totalUbiClaimed_not?: InputMaybe<Scalars['BigInt']['input']>;
totalUbiClaimed_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ubiClaimedLastDay?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastDay_gt?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastDay_gte?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastDay_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ubiClaimedLastDay_lt?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastDay_lte?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastDay_not?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastDay_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ubiClaimedLastWeek?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastWeek_gt?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastWeek_gte?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastWeek_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ubiClaimedLastWeek_lt?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastWeek_lte?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastWeek_not?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedLastWeek_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ubiClaimedNextHourProjected?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedNextHourProjected_gt?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedNextHourProjected_gte?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedNextHourProjected_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
ubiClaimedNextHourProjected_lt?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedNextHourProjected_lte?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedNextHourProjected_not?: InputMaybe<Scalars['BigInt']['input']>;
ubiClaimedNextHourProjected_not_in?: InputMaybe<Array<InputMaybe<Scalars['BigInt']['input']>>>;
};
export type GetPositionsQuery = { __typename?: 'Query', positions?: Array<{ __typename?: 'Position', id: any, owner: any, share: any, creationTime: number, lastTaxTime?: number | null, taxRate: any, status: PositionStatus } | null> | null };
export type StatsPage = {
__typename?: 'statsPage';
items: Array<Stats>;
pageInfo: PageInfo;
totalCount: Scalars['Int']['output'];
};
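The regenerated types swap The Graph's flat `positions` list for Ponder's cursor-paginated shape: `positionss` returns a `positionsPage` with `items`, `pageInfo`, and `totalCount`. A sketch of a matching query, where the endpoint and port are assumptions for a local stack:

```shell
# A query matching the new paginated shape (fields taken from the generated
# types above).
QUERY='{ positionss(limit: 5, orderBy: "createdAt", orderDirection: "desc") { items { id owner share status taxRate } pageInfo { hasNextPage endCursor } totalCount } }'
echo "$QUERY"
# Against a running stack (Ponder's GraphQL on 42069 is an assumption):
#   curl -s http://127.0.0.1:42069/graphql \
#     -H 'content-type: application/json' \
#     --data '{"query":"{ positionss(limit: 5) { items { id status } totalCount } }"}'
```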


@@ -19,7 +19,7 @@ Vue 3 + Vite application that delivers the public marketing site and forthcoming
- Motion relies on CSS transitions to keep runtime costs low
## Development Workflow
- Boot the full stack with `nohup ./scripts/local_env.sh start &`; the script handles landing dev server wiring.
- Boot the full stack with `nohup ./scripts/dev.sh start &`; the script handles landing dev server wiring.
- Use local package scripts for targeted tasks:
- `npm install`
- `npm run build`


@@ -9,6 +9,12 @@ services:
expose:
- "8545"
restart: unless-stopped
healthcheck:
test: ["CMD", "cast", "block-number", "--rpc-url", "http://localhost:8545"]
interval: 2s
timeout: 1s
retries: 5
start_period: 5s
postgres:
image: docker.io/library/postgres:16-alpine
@@ -37,10 +43,15 @@ services:
- ANVIL_RPC=http://anvil:8545
depends_on:
anvil:
condition: service_started
condition: service_healthy
postgres:
condition: service_healthy
restart: "no"
healthcheck:
test: ["CMD", "test", "-f", "/workspace/tmp/podman/contracts.env"]
interval: 5s
retries: 18
start_period: 10s
ponder:
build:
@@ -56,12 +67,20 @@ services:
- CHOKIDAR_USEPOLLING=1
depends_on:
anvil:
condition: service_started
condition: service_healthy
postgres:
condition: service_healthy
bootstrap:
condition: service_started
expose:
- "42069"
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "http://localhost:42069/"]
interval: 5s
timeout: 3s
retries: 12
start_period: 20s
webapp:
build:
@@ -76,10 +95,16 @@ services:
environment:
- CHOKIDAR_USEPOLLING=1
depends_on:
- anvil
ponder:
condition: service_healthy
expose:
- "5173"
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "http://localhost:5173/app/"]
interval: 5s
retries: 6
start_period: 10s
landing:
build:
@@ -94,10 +119,16 @@ services:
environment:
- CHOKIDAR_USEPOLLING=1
depends_on:
- anvil
ponder:
condition: service_healthy
expose:
- "5174"
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "http://localhost:5174/"]
interval: 5s
retries: 6
start_period: 10s
txn-bot:
build:
@@ -111,11 +142,16 @@ services:
- kraiken-node-modules:/workspace/kraiken-lib/node_modules
working_dir: /workspace
depends_on:
- anvil
- ponder
ponder:
condition: service_healthy
expose:
- "43069"
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "http://localhost:43069/status"]
interval: 5s
retries: 4
start_period: 10s
caddy:
image: docker.io/library/caddy:2.8
@@ -124,17 +160,18 @@ services:
ports:
- "0.0.0.0:8081:80"
depends_on:
anvil:
condition: service_started
ponder:
condition: service_started
webapp:
condition: service_started
condition: service_healthy
landing:
condition: service_started
condition: service_healthy
txn-bot:
condition: service_started
condition: service_healthy
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "http://localhost:80"]
interval: 2s
retries: 3
start_period: 2s
volumes:
postgres-data:
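When a service sits in `starting` or `unhealthy`, its declared probe can be re-run by hand. This mirrors the container lookup `scripts/dev.sh health` performs; `ponder` below is just an example service name, and the project-label filter is omitted for brevity:

```shell
# Resolve the container via its compose label (the same filter dev.sh uses),
# then execute its declared healthcheck directly (exit 0 when healthy).
c=$(podman ps --all \
  --filter 'label=com.docker.compose.service=ponder' \
  --format '{{.Names}}' | head -n1)
podman healthcheck run "$c" && echo "ponder healthy"
```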

scripts/dev.sh (new executable file, 91 lines)

@@ -0,0 +1,91 @@
#!/usr/bin/env bash
set -euo pipefail
cd "$(dirname "$0")/.."
PID_FILE=/tmp/kraiken-watcher.pid
PROJECT_NAME=${COMPOSE_PROJECT_NAME:-$(basename "$PWD")}
start_stack() {
if [[ -f "$PID_FILE" ]]; then
local existing_pid
existing_pid=$(cat "$PID_FILE")
if kill -0 "$existing_pid" 2>/dev/null; then
echo "Stopping existing kraiken-lib watcher ($existing_pid)..."
kill "$existing_pid" 2>/dev/null || true
wait "$existing_pid" 2>/dev/null || true
fi
rm -f "$PID_FILE"
fi
echo "Building kraiken-lib..."
./scripts/build-kraiken-lib.sh
echo "Starting stack..."
podman-compose up -d
echo "Watching for kraiken-lib changes..."
./scripts/watch-kraiken-lib.sh &
echo $! > "$PID_FILE"
echo ""
echo "[ok] Stack started"
echo " Web App: http://localhost:8081/app/"
echo " GraphQL: http://localhost:8081/graphql"
}
stop_stack() {
if [[ -f "$PID_FILE" ]]; then
local watcher_pid
watcher_pid=$(cat "$PID_FILE")
if kill "$watcher_pid" 2>/dev/null; then
wait "$watcher_pid" 2>/dev/null || true
fi
rm -f "$PID_FILE"
fi
podman-compose down
echo "[ok] Stack stopped"
}
check_health() {
echo "Checking health..."
local services=(anvil postgres ponder webapp landing txn-bot caddy)
for service in "${services[@]}"; do
local container
container=$(podman ps --all \
--filter "label=com.docker.compose.project=${PROJECT_NAME}" \
--filter "label=com.docker.compose.service=${service}" \
--format '{{.Names}}' | head -n1)
if [[ -z "$container" ]]; then
echo " [??] $service (not created)"
continue
fi
if podman healthcheck run "$container" &>/dev/null; then
echo " [ok] $service"
else
echo " [!!] $service"
fi
done
}
usage() {
echo "Usage: $0 {start|stop|health}"
exit 1
}
case "${1:-help}" in
start)
start_stack
;;
stop)
stop_stack
;;
health)
check_health
;;
*)
usage
;;
esac


@@ -1,557 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
STATE_DIR="$ROOT_DIR/tmp/local-env"
LOG_DIR="$STATE_DIR/logs"
ANVIL_PID_FILE="$STATE_DIR/anvil.pid"
PONDER_PID_FILE="$STATE_DIR/ponder.pid"
WEBAPP_PID_FILE="$STATE_DIR/webapp.pid"
TXNBOT_PID_FILE="$STATE_DIR/txnBot.pid"
ANVIL_LOG="$LOG_DIR/anvil.log"
PONDER_LOG="$LOG_DIR/ponder.log"
WEBAPP_LOG="$LOG_DIR/webapp.log"
TXNBOT_LOG="$LOG_DIR/txnBot.log"
SETUP_LOG="$LOG_DIR/setup.log"
TXNBOT_ENV_FILE="$STATE_DIR/txnBot.env"
MNEMONIC_FILE="$ROOT_DIR/onchain/.secret.local"
FORK_URL=${FORK_URL:-"https://sepolia.base.org"}
ANVIL_RPC="http://127.0.0.1:8545"
GRAPHQL_HEALTH="http://127.0.0.1:42069/health"
GRAPHQL_ENDPOINT="http://127.0.0.1:42069/graphql"
FRONTEND_URL="http://127.0.0.1:5173"
LOCAL_TXNBOT_URL="http://127.0.0.1:43069"
FOUNDRY_BIN=${FOUNDRY_BIN:-"$HOME/.foundry/bin"}
FORGE="$FOUNDRY_BIN/forge"
CAST="$FOUNDRY_BIN/cast"
ANVIL="$FOUNDRY_BIN/anvil"
DEFAULT_DEPLOYER_PK="0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80"
DEFAULT_DEPLOYER_ADDR="0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266"
DEPLOYER_PK=${DEPLOYER_PK:-$DEFAULT_DEPLOYER_PK}
DEPLOYER_ADDR=${DEPLOYER_ADDR:-$DEFAULT_DEPLOYER_ADDR}
FEE_DEST="0xf6a3eef9088A255c32b6aD2025f83E57291D9011"
WETH="0x4200000000000000000000000000000000000006"
SWAP_ROUTER="0x94cC0AaC535CCDB3C01d6787D6413C739ae12bc4"
DEFAULT_TXNBOT_PRIVATE_KEY="0x59c6995e998f97a5a0044966f0945389dc9e86dae88c7a8412f4603b6b78690d"
DEFAULT_TXNBOT_ADDRESS="0x70997970C51812dc3A010C7d01b50e0d17dc79C8"
TXNBOT_PRIVATE_KEY=${TXNBOT_PRIVATE_KEY:-$DEFAULT_TXNBOT_PRIVATE_KEY}
TXNBOT_ADDRESS=${TXNBOT_ADDRESS:-$DEFAULT_TXNBOT_ADDRESS}
TXNBOT_FUND_VALUE=${TXNBOT_FUND_VALUE:-1ether}
MAX_UINT="0xffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
SKIP_CLEANUP=false
COMMAND="start"
cleanup() {
stop_process "Frontend" "$WEBAPP_PID_FILE"
stop_process "Ponder" "$PONDER_PID_FILE"
stop_process "TxnBot" "$TXNBOT_PID_FILE"
stop_process "Anvil" "$ANVIL_PID_FILE"
if [[ "$SKIP_CLEANUP" == true ]]; then
log "Skipping state cleanup (--no-cleanup). State preserved at $STATE_DIR"
return
fi
if [[ -d "$STATE_DIR" ]]; then
rm -rf "$STATE_DIR"
fi
}
stop_process() {
local name="$1"
local pid_file="$2"
if [[ -f "$pid_file" ]]; then
local pid
pid="$(cat "$pid_file")"
if kill -0 "$pid" 2>/dev/null; then
echo "[local-env] Stopping $name (pid $pid)"
kill "$pid" 2>/dev/null || true
wait "$pid" 2>/dev/null || true
fi
rm -f "$pid_file"
fi
}
log() {
echo "[local-env] $*"
}
wait_for_rpc() {
local url="$1"
for _ in {1..60}; do
if curl --max-time 1 --connect-timeout 1 -s -o /dev/null -X POST "$url" -H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"eth_chainId","params":[]}'; then
return 0
fi
sleep 1
done
log "Timed out waiting for RPC at $url"
return 1
}
wait_for_http() {
local url="$1"
for _ in {1..60}; do
if curl --max-time 1 --connect-timeout 1 -sSf "$url" >/dev/null 2>&1; then
return 0
fi
sleep 1
done
log "Timed out waiting for $url"
return 1
}
set_deployer_credentials() {
if [[ -n "$DEPLOYER_PK" && -n "$DEPLOYER_ADDR" ]]; then
return
fi
if [[ -f "$MNEMONIC_FILE" ]]; then
local mnemonic
mnemonic="$(tr -d '\n\r' < "$MNEMONIC_FILE")"
if [[ -n "$mnemonic" ]]; then
local derived_pk
derived_pk="$("$CAST" wallet private-key --mnemonic "$mnemonic" \
--mnemonic-derivation-path "m/44'/60'/0'/0/0")"
local derived_addr
derived_addr="$("$CAST" wallet address --private-key "$derived_pk")"
DEPLOYER_PK=${DEPLOYER_PK:-$derived_pk}
DEPLOYER_ADDR=${DEPLOYER_ADDR:-$derived_addr}
fi
fi
DEPLOYER_PK=${DEPLOYER_PK:-$DEFAULT_DEPLOYER_PK}
DEPLOYER_ADDR=${DEPLOYER_ADDR:-$DEFAULT_DEPLOYER_ADDR}
}
ensure_tools() {
for cmd in "$ANVIL" "$FORGE" "$CAST" jq npm curl; do
if ! command -v "$cmd" >/dev/null 2>&1; then
log "Required command not found: $cmd"
exit 1
fi
done
}
start_directories() {
mkdir -p "$LOG_DIR"
}
ensure_dependencies() {
if [[ ! -d "$ROOT_DIR/services/ponder/node_modules" ]]; then
log "Installing ponder dependencies"
pushd "$ROOT_DIR/services/ponder" >/dev/null
npm install >>"$SETUP_LOG" 2>&1
popd >/dev/null
fi
if [[ ! -d "$ROOT_DIR/web-app/node_modules" ]]; then
log "Installing web-app dependencies"
pushd "$ROOT_DIR/web-app" >/dev/null
npm install >>"$SETUP_LOG" 2>&1
popd >/dev/null
fi
if [[ ! -d "$ROOT_DIR/services/txnBot/node_modules" ]]; then
log "Installing txnBot dependencies"
pushd "$ROOT_DIR/services/txnBot" >/dev/null
npm install >>"$SETUP_LOG" 2>&1
popd >/dev/null
fi
}
start_anvil() {
if [[ -f "$ANVIL_PID_FILE" ]]; then
local existing_pid
existing_pid="$(cat "$ANVIL_PID_FILE")"
if kill -0 "$existing_pid" 2>/dev/null; then
log "Anvil already running (pid $existing_pid)"
return
fi
log "Found stale Anvil pid file (pid $existing_pid); restarting Anvil"
rm -f "$ANVIL_PID_FILE"
fi
log "Starting Anvil (forking $FORK_URL)"
local anvil_args=("--fork-url" "$FORK_URL" "--chain-id" 31337 "--block-time" 1 \
"--host" 127.0.0.1 "--port" 8545 "--threads" 4 "--timeout" 2000 "--retries" 2 \
"--fork-retry-backoff" 100)
if [[ -f "$MNEMONIC_FILE" ]]; then
local mnemonic
mnemonic="$(tr -d '\n\r' < "$MNEMONIC_FILE")"
if [[ -n "$mnemonic" ]]; then
anvil_args+=("--mnemonic" "$mnemonic")
fi
fi
"$ANVIL" "${anvil_args[@]}" >"$ANVIL_LOG" 2>&1 &
echo $! >"$ANVIL_PID_FILE"
wait_for_rpc "$ANVIL_RPC"
}
deploy_protocol() {
log "Deploying contracts"
pushd "$ROOT_DIR/onchain" >/dev/null
"$FORGE" script script/DeployLocal.sol --fork-url "$ANVIL_RPC" --broadcast \
>>"$SETUP_LOG" 2>&1
popd >/dev/null
}
extract_addresses() {
local run_file
run_file="$(ls -t "$ROOT_DIR/onchain/broadcast/DeployLocal.sol"/*/run-latest.json 2>/dev/null | head -n1)"
if [[ -z "$run_file" || ! -f "$run_file" ]]; then
log "Deployment artifact not found under onchain/broadcast/DeployLocal.sol"
exit 1
fi
LIQUIDITY_MANAGER="$(jq -r '.transactions[] | select(.contractName=="LiquidityManager") | .contractAddress' "$run_file" | head -n1)"
KRAIKEN="$(jq -r '.transactions[] | select(.contractName=="Kraiken") | .contractAddress' "$run_file" | head -n1)"
STAKE="$(jq -r '.transactions[] | select(.contractName=="Stake") | .contractAddress' "$run_file" | head -n1)"
log "Using deployment artifact: ${run_file#$ROOT_DIR/}"
if [[ -z "$LIQUIDITY_MANAGER" || "$LIQUIDITY_MANAGER" == "null" ]]; then
log "Failed to extract LiquidityManager address"
exit 1
fi
echo "LIQUIDITY_MANAGER=$LIQUIDITY_MANAGER" >"$STATE_DIR/contracts.env"
echo "KRAIKEN=$KRAIKEN" >>"$STATE_DIR/contracts.env"
echo "STAKE=$STAKE" >>"$STATE_DIR/contracts.env"
# Get deployment block number
local deploy_block
deploy_block="$(jq -r '.receipts[0].blockNumber' "$run_file" | xargs printf "%d")"
# Create .env.local for Ponder with deployed addresses
cat > "$ROOT_DIR/services/ponder/.env.local" <<EOF
# Auto-generated by local_env.sh
PONDER_NETWORK=BASE_SEPOLIA_LOCAL_FORK
KRAIKEN_ADDRESS=$KRAIKEN
STAKE_ADDRESS=$STAKE
START_BLOCK=$deploy_block
# Use PostgreSQL connection
DATABASE_URL=postgresql://ponder:ponder_local@localhost/ponder_local
DATABASE_SCHEMA=ponder_local_${deploy_block}
EOF
log "Created Ponder .env.local with deployed addresses and block $deploy_block"
}
source_addresses() {
if [[ ! -f "$STATE_DIR/contracts.env" ]]; then
return 1
fi
# shellcheck disable=SC1090
source "$STATE_DIR/contracts.env"
}
bootstrap_liquidity_manager() {
source_addresses || { log "Contract addresses not found"; exit 1; }
log "Funding LiquidityManager"
"$CAST" send --rpc-url "$ANVIL_RPC" --private-key "$DEPLOYER_PK" \
"$LIQUIDITY_MANAGER" --value 0.1ether >>"$SETUP_LOG" 2>&1
log "Granting recenter access"
"$CAST" rpc --rpc-url "$ANVIL_RPC" anvil_impersonateAccount "$FEE_DEST" >/dev/null
"$CAST" send --rpc-url "$ANVIL_RPC" --from "$FEE_DEST" --unlocked \
"$LIQUIDITY_MANAGER" "setRecenterAccess(address)" "$DEPLOYER_ADDR" >>"$SETUP_LOG" 2>&1
"$CAST" rpc --rpc-url "$ANVIL_RPC" anvil_stopImpersonatingAccount "$FEE_DEST" >/dev/null
log "Calling recenter()"
"$CAST" send --rpc-url "$ANVIL_RPC" --private-key "$DEPLOYER_PK" \
"$LIQUIDITY_MANAGER" "recenter()" >>"$SETUP_LOG" 2>&1
if [[ -n "$TXNBOT_ADDRESS" ]]; then
log "Transferring recenter access to txnBot $TXNBOT_ADDRESS"
"$CAST" rpc --rpc-url "$ANVIL_RPC" anvil_impersonateAccount "$FEE_DEST" >/dev/null
"$CAST" send --rpc-url "$ANVIL_RPC" --from "$FEE_DEST" --unlocked \
"$LIQUIDITY_MANAGER" "setRecenterAccess(address)" "$TXNBOT_ADDRESS" >>"$SETUP_LOG" 2>&1
"$CAST" rpc --rpc-url "$ANVIL_RPC" anvil_stopImpersonatingAccount "$FEE_DEST" >/dev/null
else
log "TXNBOT_ADDRESS not set; recenter access left with deployer"
fi
}
prepare_application_state() {
source_addresses || { log "Contract addresses not found"; exit 1; }
log "Wrapping 0.02 ETH into WETH"
"$CAST" send --rpc-url "$ANVIL_RPC" --private-key "$DEPLOYER_PK" \
"$WETH" "deposit()" --value 0.02ether >>"$SETUP_LOG" 2>&1
log "Approving SwapRouter"
"$CAST" send --rpc-url "$ANVIL_RPC" --private-key "$DEPLOYER_PK" \
"$WETH" "approve(address,uint256)" "$SWAP_ROUTER" "$MAX_UINT" >>"$SETUP_LOG" 2>&1
log "Executing initial KRK buy"
"$CAST" send --legacy --gas-limit 300000 --rpc-url "$ANVIL_RPC" --private-key "$DEPLOYER_PK" \
"$SWAP_ROUTER" "exactInputSingle((address,address,uint24,address,uint256,uint256,uint160))" \
"($WETH,$KRAIKEN,10000,$DEPLOYER_ADDR,10000000000000000,0,0)" >>"$SETUP_LOG" 2>&1
}
prime_chain_for_indexing() {
log "Pre-mining blocks for indexer stability"
for _ in {1..2000}; do
"$CAST" rpc --rpc-url "$ANVIL_RPC" evm_mine > /dev/null 2>&1 || true
done
}
write_txnbot_env() {
source_addresses || { log "Contract addresses not found"; exit 1; }
cat > "$TXNBOT_ENV_FILE" <<EOF
ENVIRONMENT=BASE_SEPOLIA_LOCAL_FORK
PROVIDER_URL=$ANVIL_RPC
PRIVATE_KEY=$TXNBOT_PRIVATE_KEY
LM_CONTRACT_ADDRESS=$LIQUIDITY_MANAGER
STAKE_CONTRACT_ADDRESS=$STAKE
GRAPHQL_ENDPOINT=$GRAPHQL_ENDPOINT
WALLET_ADDRESS=$TXNBOT_ADDRESS
# Expose a non-conflicting status port for local dev
PORT=43069
EOF
}
fund_txnbot_wallet() {
source_addresses || { log "Contract addresses not found"; exit 1; }
if [[ -z "$TXNBOT_ADDRESS" ]]; then
log "TxnBot wallet address not provided; skipping funding"
return
fi
log "Funding txnBot wallet $TXNBOT_ADDRESS with $TXNBOT_FUND_VALUE"
"$CAST" send --rpc-url "$ANVIL_RPC" --private-key "$DEPLOYER_PK" \
"$TXNBOT_ADDRESS" --value "$TXNBOT_FUND_VALUE" >>"$SETUP_LOG" 2>&1 || \
log "Funding txnBot wallet failed (see setup log)"
# Ensure local fork balance reflects the funding even if the upstream state is zero
local fund_value_wei
fund_value_wei="$("$CAST" --to-unit "$TXNBOT_FUND_VALUE" wei)"
local fund_value_hex
fund_value_hex="$("$CAST" --to-hex "$fund_value_wei")"
"$CAST" rpc --rpc-url "$ANVIL_RPC" anvil_setBalance "$TXNBOT_ADDRESS" "$fund_value_hex" \
>>"$SETUP_LOG" 2>&1
}
start_txnbot() {
if [[ -f "$TXNBOT_PID_FILE" ]]; then
log "txnBot already running (pid $(cat "$TXNBOT_PID_FILE"))"
return
fi
write_txnbot_env
fund_txnbot_wallet
log "Starting txnBot automation"
pushd "$ROOT_DIR/services/txnBot" >/dev/null
TXN_BOT_ENV_FILE="$TXNBOT_ENV_FILE" node service.js >"$TXNBOT_LOG" 2>&1 &
popd >/dev/null
echo $! >"$TXNBOT_PID_FILE"
}
start_ponder() {
if [[ -f "$PONDER_PID_FILE" ]]; then
log "Ponder already running (pid $(cat "$PONDER_PID_FILE"))"
return
fi
log "Starting Ponder indexer"
pushd "$ROOT_DIR/services/ponder" >/dev/null
PONDER_NETWORK=BASE_SEPOLIA_LOCAL_FORK npm run dev >"$PONDER_LOG" 2>&1 &
popd >/dev/null
echo $! >"$PONDER_PID_FILE"
wait_for_http "$GRAPHQL_HEALTH"
}
start_frontend() {
if [[ -f "$WEBAPP_PID_FILE" ]]; then
log "Frontend already running (pid $(cat "$WEBAPP_PID_FILE"))"
return
fi
source_addresses || { log "Contract addresses not found"; exit 1; }
local vite_env=(
"VITE_DEFAULT_CHAIN_ID=31337"
"VITE_LOCAL_RPC_URL=/rpc/anvil"
"VITE_LOCAL_RPC_PROXY_TARGET=$ANVIL_RPC"
"VITE_KRAIKEN_ADDRESS=$KRAIKEN"
"VITE_STAKE_ADDRESS=$STAKE"
"VITE_LIQUIDITY_MANAGER=$LIQUIDITY_MANAGER"
"VITE_SWAP_ROUTER=$SWAP_ROUTER"
"VITE_PONDER_BASE_SEPOLIA_LOCAL_FORK=$GRAPHQL_ENDPOINT"
"VITE_TXNBOT_BASE_SEPOLIA_LOCAL_FORK=$LOCAL_TXNBOT_URL"
"VITE_ENABLE_EVENT_STREAM=false"
"VITE_POSITIONS_POLL_MS=0"
)
log "Starting frontend (Vite dev server)"
pushd "$ROOT_DIR/web-app" >/dev/null
env "${vite_env[@]}" npm run dev -- --host 0.0.0.0 --port 5173 >"$WEBAPP_LOG" 2>&1 &
popd >/dev/null
echo $! >"$WEBAPP_PID_FILE"
wait_for_http "$FRONTEND_URL"
}
start_environment() {
ensure_tools
set_deployer_credentials
start_directories
ensure_dependencies
start_anvil
deploy_protocol
extract_addresses
bootstrap_liquidity_manager
prepare_application_state
prime_chain_for_indexing
start_ponder # Re-enabled with PostgreSQL
start_txnbot
start_frontend
}
start_and_wait() {
start_environment
source_addresses || true
log "Environment ready"
log " RPC: $ANVIL_RPC"
log " Kraiken: $KRAIKEN"
log " Stake: $STAKE"
log " LM: $LIQUIDITY_MANAGER"
log " Indexer: $GRAPHQL_HEALTH"
log " Frontend: $FRONTEND_URL"
log "Press Ctrl+C to shut everything down"
wait "$(cat "$ANVIL_PID_FILE")" "$(cat "$PONDER_PID_FILE")" "$(cat "$WEBAPP_PID_FILE")" "$(cat "$TXNBOT_PID_FILE")"
}
stop_environment() {
local previous_skip="$SKIP_CLEANUP"
SKIP_CLEANUP=false
cleanup
SKIP_CLEANUP="$previous_skip"
log "Environment stopped"
}
on_exit() {
local status=$?
if [[ "$COMMAND" != "start" ]]; then
return
fi
if [[ $status -ne 0 ]]; then
log "Start command exited with status $status"
if [[ -f "$SETUP_LOG" ]]; then
log "Last 40 lines of setup log:"
tail -n 40 "$SETUP_LOG"
fi
fi
cleanup
if [[ "$SKIP_CLEANUP" == true ]]; then
log "State directory preserved at $STATE_DIR"
else
log "Environment stopped"
fi
}
usage() {
cat <<EOF
Usage: scripts/local_env.sh [--no-cleanup] [start|stop|status]
Commands:
start Bootstrap local chain, contracts, indexer, and frontend (default)
stop Terminate all background services
status Show running process information
Options:
--no-cleanup Preserve logs and state directory on exit (start command only)
EOF
}
status_environment() {
source_addresses 2>/dev/null || true
printf '\n%-12s %-8s %-10s %s\n' "Service" "Status" "PID" "Details"
printf '%s\n' "-------------------------------------------------------------"
service_status "Anvil" "$ANVIL_PID_FILE" "$ANVIL_LOG"
service_status "Ponder" "$PONDER_PID_FILE" "$PONDER_LOG"
service_status "Frontend" "$WEBAPP_PID_FILE" "$WEBAPP_LOG"
service_status "TxnBot" "$TXNBOT_PID_FILE" "$TXNBOT_LOG"
if [[ -f "$STATE_DIR/contracts.env" ]]; then
printf '\nContracts:\n'
cat "$STATE_DIR/contracts.env"
printf '\n'
fi
}
service_status() {
local name="$1" pid_file="$2" log_file="$3"
if [[ -f "$pid_file" ]]; then
local pid="$(cat "$pid_file")"
if kill -0 "$pid" 2>/dev/null; then
printf '%-12s %-8s %-10s %s\n' "$name" "running" "$pid" "$log_file"
return
fi
fi
printf '%-12s %-8s %-10s %s\n' "$name" "stopped" "-" "$log_file"
}
COMMAND="start"
while (($#)); do
case "$1" in
--no-cleanup)
SKIP_CLEANUP=true
;;
start|stop|status)
COMMAND="$1"
;;
*)
usage
exit 1
;;
esac
shift
done
if [[ "$COMMAND" == "start" ]]; then
trap on_exit EXIT
fi
case "$COMMAND" in
start)
trap 'log "Caught signal, stopping..."; exit 0' INT TERM
start_and_wait
;;
stop)
stop_environment
;;
status)
status_environment
;;
*)
usage
exit 1
;;
esac


@@ -14,7 +14,7 @@ Ponder-based indexer that records Kraiken protocol activity and exposes the Grap
- `.ponder/` - Local SQLite/state cache (safe to delete when schemas change).
## Development Workflow
- Primary path: `nohup ./scripts/local_env.sh start &` boots Anvil, deploys contracts, and launches Ponder in watch mode.
- Primary path: `nohup ./scripts/dev.sh start &` boots Anvil, deploys contracts, and launches Ponder in watch mode.
- Podman stack: `podman-compose up -d` starts all services including PostgreSQL; bootstrap creates `.env.local` automatically.
- Focused debugging: within `services/ponder/`, run `npm install` then `PONDER_NETWORK=BASE_SEPOLIA_LOCAL_FORK npm run dev` once the stack is already online.
- For production-style runs, use `npm run build` followed by `PONDER_NETWORK=BASE npm run start` and point `DATABASE_URL` to PostgreSQL if persistence is required.


@@ -26,7 +26,7 @@ cp .env.example .env
```
Edit `.env` to select your network:
- `PONDER_NETWORK=BASE_SEPOLIA_LOCAL_FORK` - Local Anvil fork managed by `scripts/local_env.sh`
- `PONDER_NETWORK=BASE_SEPOLIA_LOCAL_FORK` - Local Anvil fork managed by `scripts/dev.sh`
- `PONDER_NETWORK=BASE_SEPOLIA` - Base Sepolia testnet
- `PONDER_NETWORK=BASE` - Base mainnet


@@ -27,7 +27,7 @@ GRAPHQL_ENDPOINT=<ponder-graphql-url>
```
## Operations
- Launch via `nohup ./scripts/local_env.sh start &`; the script handles env injection and process supervision.
- Launch via `nohup ./scripts/dev.sh start &`; the script handles env injection and process supervision.
- For focused debugging, run `npm install` then `npm start` within `services/txnBot/` after the stack is already up.
- Monitor logs for transaction receipts and ensure gas pricing stays within acceptable bounds.


@@ -6,7 +6,7 @@ Automation worker that monitors staking positions and calls `recenter()` / `payT
The bot supports three environments shared across the stack:
- `BASE_SEPOLIA_LOCAL_FORK` Anvil fork started by `scripts/local_env.sh`
- `BASE_SEPOLIA_LOCAL_FORK` Anvil fork started by `scripts/dev.sh`
- `BASE_SEPOLIA` Public Base Sepolia testnet
- `BASE` Base mainnet
@@ -26,7 +26,7 @@ GRAPHQL_ENDPOINT=<ponder-graphql-url>
# Optional: PORT=43069
```
`scripts/local_env.sh start` generates these values automatically for the local fork, writes them to a temporary file, and keeps the process running in the background.
`scripts/dev.sh start` generates these values automatically for the local fork, writes them to a temporary file, and keeps the process running in the background.
## Local Run


@@ -10,7 +10,7 @@
<f-card>
<div class="cheats-form">
<f-input v-model="rpcUrl" label="RPC URL"></f-input>
<p class="cheats-hint">Defaults to the Anvil instance from <code>scripts/local_env.sh</code>.</p>
<p class="cheats-hint">Defaults to the Anvil instance from <code>scripts/dev.sh</code>.</p>
<div class="cheats-actions">
<f-button
:disabled="!hasWalletProvider || addingNetwork"