diff --git a/AGENTS.md b/AGENTS.md index 0dac92db..5b31f5ae 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -179,6 +179,12 @@ When touching transaction and position flows, validation MUST include all releva 41. **Envio market-detail read integrity**: when Envio backs market-detail participants or activity tables, share-to-asset conversions must use the already-loaded live market state keyed by `chainId + market.uniqueKey` instead of a second indexer totals query; participant caches must not store state-derived converted values behind a key that ignores the live state; event/liquidation tables must fetch only the requested page window with correct merged ordering rather than scanning full market history unless the UI explicitly requires an exact total count; provider fallbacks must page at the provider boundary or fail closed with typed source/network errors instead of fetching full history and slicing client-side or returning empty success on missing subgraph configuration; unknown-total pagination must use an explicit open-ended `hasNextPage` mode instead of synthesizing a moving “last page”; and page transitions in that mode must start from a neutral loading state rather than reusing stale rows from the previous page. 42. **Oracle metadata source integrity**: oracle vendor/type/feed classification must resolve from the scanner metadata source keyed by `chainId + oracleAddress`. Do not reintroduce Morpho API `oracles` feed enrichment into market objects or UI/filter/warning logic as a fallback source for oracle structure. 43. **Mixed oracle badge signal integrity**: when a standard or meta oracle contains both classified feeds and unknown/unverified feeds, vendor badges and their tooltips must preserve both signals together (known vendor icon(s) plus unknown indicator/text) instead of collapsing to only the recognized vendor. +44. 
**Grouped realized-rate aggregation integrity**: when aggregating realized APY/APR across multiple markets or positions, aggregate raw earned value and capital-time exposure first, then annualize once at the grouped level. Do not weight or average already-annualized per-market realized rates, because dust positions with tiny capital can dominate the result spuriously. +45. **Primary-empty fallback integrity**: when a primary indexer/provider backs market participants, user-position discovery, user history, or similar source-discovery reads, treat an empty primary result as authoritative only when the primary query completed successfully and fully covered the exact requested scope. Empty results from partial pages, coverage-limited endpoints, or lag-prone non-authoritative reads must still fall through to the next provider; empty results from scoped, fully paginated primary reads must not trigger fallback. +46. **External address-filter casing integrity**: when querying external indexers by address string fields (`user`, `onBehalf`, `borrower`, etc.), do not assume case-insensitive matching. Use the backend’s documented canonical normalization, or query safe canonical/exact-case variants of the same address and keep downstream identity/dedup keyed by canonical lowercase `chainId + address` or `chainId + txHash`. +47. **Indexed event-history pagination integrity**: when stitching user history from raw indexer event tables, page each table with a stable unique order (include an event-unique tie-breaker such as indexed row `id`, not just `timestamp` or `txHash`), freeze the page window against new-head inserts during pagination, dedupe API pages by source event identity, and reconcile local receipt caches to indexed history with a cross-source event merge key (`hash + type + market + assets + shares`, or stronger) rather than raw tx-hash presence. +48. 
**Paged history source consistency integrity**: when a paged transaction/history read falls back from a primary provider, choose the fallback provider once per query and keep it fixed across all pages. Do not mix providers page-by-page, and do not silently return truncated history when page caps are hit; fail closed with an explicit error instead. +49. **Merged-history pagination query integrity**: when a UI paginates over a merged transaction/history dataset that is fetched as a full filtered set (for example because the backend cannot page a merged multi-table stream correctly), keep page changes local to the UI and do not include `skip`/`first` or current-page state in the upstream query key. Fetch once per filter scope, then slice locally, so pagination controls do not re-trigger the full upstream history cycle on every click. ### REQUIRED: Regression Rule Capture diff --git a/docs/TECHNICAL_OVERVIEW.md b/docs/TECHNICAL_OVERVIEW.md index 1e58cdb5..822b6f62 100644 --- a/docs/TECHNICAL_OVERVIEW.md +++ b/docs/TECHNICAL_OVERVIEW.md @@ -195,8 +195,9 @@ Market metrics: Monarch metrics API via `/api/monarch/metrics` |-----------|--------|---------|------------| | Markets list | Morpho API/Subgraph | 5 min stale | `useMarketsQuery` | | Market metrics (flows, trending) | Monarch API | 5 min stale | `useMarketMetricsQuery` | -| Market state (APY, utilization) | Morpho API | 30s stale | `useMarketData` | -| User positions | Morpho API + on-chain | 5 min | `useUserPositions` | +| Market state (APY, utilization) | RPC snapshot + Morpho API/Subgraph | 30s stale | `useMarketData` | +| User positions | Monarch position discovery + on-chain snapshots + market registry from `useProcessedMarkets` | 5 min | `useUserPositions` | +| User transaction history | Monarch GraphQL → Morpho API → Subgraph (`assetIds` queries still skip Monarch) | 60s | `useUserTransactionsQuery` | | Vaults list | Morpho API | 5 min | `useAllMorphoVaultsQuery` | | User autovault metadata | Monarch GraphQL + 
on-chain enrichment | 60s | `useUserVaultsV2Query` | | Vault detail/settings metadata | Monarch GraphQL + narrow RPC fallback | 30s | `useVaultV2Data` | @@ -204,9 +205,66 @@ Market metrics: Monarch metrics API via `/api/monarch/metrics` | Vault allocations | On-chain multicall | 30s | `useAllocationsQuery` | | Token balances | On-chain multicall | 5 min | `useUserBalancesQuery` | | Oracle metadata | Scanner Gist | 30 min | `useOracleMetadata` / `useAllOracleMetadata` | -| Merkl rewards | Merkl API | On demand | `useMerklCampaignsQuery` | +| User rewards and distributions | Morpho rewards REST + Merkl API | 30s | `useUserRewardsQuery` | +| Reward campaigns | Merkl API | 5 min stale | `useMerklCampaignsQuery` | | Market liquidations | Monarch GraphQL + Morpho API/Subgraph fallback | 5 min stale | `useMarketLiquidations` | -| Admin stats transactions | Monarch GraphQL | 2 min stale | `useMonarchTransactions` | +| Admin stats transactions | Monarch GraphQL + market registry/token price enrichment | 2 min stale | `useMonarchTransactions` | + +### Data Hook Responsibility Matrix + +This is the migration checklist for the Monarch API (Envio GraphQL endpoint). "Full Monarch support" here means the feature would still work if Morpho API and subgraph reads were unavailable. + +Hooks omitted from this matrix are local-state hooks or pure view/composition helpers that do not own remote transport reads. 
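The Monarch-first fallback chains this matrix describes, together with the primary-empty fallback rule from the AGENTS.md hunk, can be sketched as below. All names (`Source`, `fetchWithFallback`, the `complete` flag) are illustrative assumptions, not the app's actual helpers: an empty primary result is treated as authoritative only when the provider reports full coverage of the requested scope.

```typescript
type SourceResult<T> = { items: T[]; complete: boolean };
type Source<T> = { name: string; fetch: () => Promise<SourceResult<T>> };

// Try each provider in order. An empty result stops the fallback chain only
// when the provider fully covered the requested scope; empties from partial
// pages or coverage-limited reads fall through to the next provider.
async function fetchWithFallback<T>(sources: Source<T>[]): Promise<{ source: string; items: T[] }> {
  let lastError: unknown = null;
  for (const source of sources) {
    try {
      const result = await source.fetch();
      if (result.items.length > 0 || result.complete) {
        return { source: source.name, items: result.items };
      }
      // Empty but not authoritative: keep falling through.
    } catch (error) {
      lastError = error;
    }
  }
  throw lastError ?? new Error('all providers returned non-authoritative empty results');
}
```

Under this sketch, a Monarch read that returns an empty but fully paginated page set stops the chain with an empty result, while a coverage-limited empty read falls through to Morpho API and then the subgraph.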
+ +#### Core Markets And Positions + +| Hook / Family | Responsibility | Infra Today | Full Monarch Support Still Needs | +|---------------|----------------|-------------|----------------------------------| +| `useMarketsQuery` | Global market registry used across the app | Morpho API first per chain, then subgraph | Monarch market registry and market detail parity | +| `useProcessedMarkets` | Blacklist/filtering layer on top of market registry, plus USD backfill | `useMarketsQuery` + `useTokenPrices` | Inherits `useMarketsQuery`; also needs a Monarch-native token price source if we want to remove Morpho price reads | +| `useMarketData` | Single-market detail shell with freshest live state | RPC snapshot + Morpho API, then subgraph | Monarch single-market metadata/detail path | +| `useMarketHistoricalData` | Historical market chart series | Morpho historical API, then subgraph | Monarch historical market snapshots/timeseries | +| `useTokenPrices` | Token USD price lookup and peg fallback used by markets/admin stats | Morpho price API + major price fallback | Monarch price endpoint or another canonical replacement | +| `useUserPositions` | Discover all markets where a user has positions, then attach live balances | Monarch batched `Position` discovery + RPC snapshots/oracle reads + market metadata from `useProcessedMarkets`; Morpho/Subgraph fallback for discovery | Monarch market registry/detail if position objects should no longer depend on Morpho/Subgraph market metadata | +| `useUserPosition` | Single-market user position | RPC snapshot first; if snapshot unavailable, Monarch position state when local market exists; then Morpho/Subgraph fallback | Same market-registry/detail gap as `useUserPositions` | +| `useUserTransactionsQuery` / `fetchUserTransactions` | User history across one or many chains | Monarch user-event tables first; fallback Morpho API, then subgraph; `assetIds` filter still bypasses Monarch | Asset-address filtered history support to fully back 
reports and any asset-scoped history views | +| `useUserPositionsSummaryData` | Portfolio earnings summary for active positions | `useUserPositions` + `useUserTransactionsQuery` + RPC block/snapshot helpers | Inherits the remaining `useUserPositions` and `useUserTransactionsQuery` gaps | +| `usePositionReport` | Asset-scoped earnings/report generation | `fetchUserTransactions(assetIds=...)` + RPC block/snapshot helpers | Still blocked on Monarch support for `assetIds`-scoped user history | +| `usePositionHistoryChart` | Derive chart points for one asset/market group | Pure derivation from transactions + snapshots already fetched elsewhere | No backend gap; inherits upstream history/snapshot gaps | + +#### Market Detail And Admin Reads + +| Hook / Family | Responsibility | Infra Today | Full Monarch Support Still Needs | +|---------------|----------------|-------------|----------------------------------| +| `useMarketSuppliers` / `useMarketBorrowers` | Paginated open positions on one market | Monarch first, then Morpho API, then subgraph | Already Monarch-first; no new Envio schema gap identified | +| `useAllMarketSuppliers` / `useAllMarketBorrowers` | Non-paginated top positions for concentration charts | Monarch first, then Morpho API, then subgraph | Already Monarch-first; no new Envio schema gap identified | +| `useMarketSupplies` / `useMarketBorrows` | Paginated supply/withdraw and borrow/repay activity | Monarch first, then Morpho API, then subgraph | Already Monarch-first; no new Envio schema gap identified | +| `useMarketLiquidations` | Paginated liquidations | Monarch first, then Morpho API, then subgraph | Already Monarch-first; no new Envio schema gap identified | +| `useMonarchTransactions` | Admin stats feed and aggregated flow dashboards | Monarch transactions + `useProcessedMarkets` + `useTokenPrices` | If admin stats should be fully independent, Monarch also needs market registry/metadata and a non-Morpho price source | + +#### Vaults And Allocators 
+ +| Hook / Family | Responsibility | Infra Today | Full Monarch Support Still Needs | +|---------------|----------------|-------------|----------------------------------| +| `useUserVaultsV2Query` | User vault list with optional balance, TVL, and yield enrichment | Monarch vault metadata + RPC balances/totalAssets + RPC 4626 yield snapshots | Already off Morpho for yield; no new Envio schema gap identified | +| `useVaultV2Data` | Vault detail/settings metadata for a single vault | Monarch vault detail first, narrow RPC fallback if metadata unavailable | Already aligned with Monarch-first design | +| `useAllMorphoVaultsQuery` | Global whitelisted vault registry | Morpho API only | Monarch/public vault registry parity | +| `usePublicAllocatorVaults` | Public allocator config for supplying vaults in a market | Morpho API only | Monarch/public allocator config endpoint parity | +| `useAllocationsQuery` | Live vault `allocation(capId)` values | Pure RPC multicall | No Envio gap | +| `usePublicAllocatorLiveData` | Live flow caps, vault supply, and liquidity for allocator UX | Pure RPC multicall | No Envio gap | +| `useVaultHistoricalApy` / `use4626VaultAPR` | Historical 4626 yield and expected carry calculations | Pure RPC share-price snapshots + RPC Morpho market reads | No Envio gap | + +#### RPC Helpers And External Reads + +| Hook / Family | Responsibility | Infra Today | Full Monarch Support Still Needs | +|---------------|----------------|-------------|----------------------------------| +| `useCurrentBlocks` / `useBlockTimestamps` / `usePositionSnapshots` / `useFreshMarketsState` / `useHistoricalSupplierPositions` | Block, snapshot, and live-state helpers used by positions/charts | Pure RPC reads via viem/wagmi | No Envio gap | +| `useUserBalancesQuery` | ERC20 wallet balances across chains | Pure RPC multicall via wagmi | No Envio gap | +| `useTokensQuery` | Token metadata lookup for app UI | Local token registry + Pendle assets API | Not part of Monarch 
migration | +| `useOracleMetadata` / `useAllOracleMetadata` | Oracle classification and feed metadata | Scanner gist JSON | Not part of Monarch migration | +| `useMarketMetricsQuery` | Enhanced market metrics, flows, trending, scores | Monarch metrics API via `/api/monarch/metrics` | Already Monarch-backed | +| `useUserRewardsQuery` | Claimable rewards and distributions | Morpho rewards REST + Merkl API | Outside Monarch/Envio scope today | +| `useMerklCampaignsQuery` / `useMerklHoldIncentivesQuery` | Campaign and HOLD incentive enrichment | Merkl API + hardcoded opportunity mapping | Outside Monarch/Envio scope today | ### Data Flow Patterns @@ -218,9 +276,9 @@ Split: allMarkets vs whitelistedMarkets **Position Data Flow:** ``` -1. Fetch market keys from API (which markets user has positions in) -2. Fetch on-chain snapshots per market (usePositionSnapshots) -3. Combine with market metadata +1. Discover market keys via Monarch batched `Position` reads when possible; fall back to Morpho API/Subgraph +2. Fetch on-chain snapshots per market (`usePositionSnapshots`) +3. Combine live balances with market metadata from `useProcessedMarkets` 4. Group by loan asset 5. Calculate earnings ``` diff --git a/src/abis/erc4626.ts b/src/abis/erc4626.ts index 02012a44..48cb0565 100644 --- a/src/abis/erc4626.ts +++ b/src/abis/erc4626.ts @@ -5,6 +5,13 @@ import type { Abi } from 'viem'; * Keep this small on purpose to avoid importing the giant vault ABI. 
*/ export const erc4626Abi = [ + { + inputs: [], + name: 'decimals', + outputs: [{ internalType: 'uint8', name: '', type: 'uint8' }], + stateMutability: 'view', + type: 'function', + }, { inputs: [], name: 'asset', diff --git a/src/data-sources/monarch-api/index.ts b/src/data-sources/monarch-api/index.ts index 63e09573..7f31fe30 100644 --- a/src/data-sources/monarch-api/index.ts +++ b/src/data-sources/monarch-api/index.ts @@ -1,4 +1,10 @@ export { monarchGraphqlFetcher } from './fetchers'; +export { + fetchMonarchUserPositionMarketsForNetworks, + fetchMonarchUserPositionStateForMarket, + type MonarchUserPositionState, +} from './positions'; +export { fetchMonarchUserTransactions } from './user-transactions'; export { fetchMonarchMarketBorrowers, fetchMonarchMarketBorrows, diff --git a/src/data-sources/monarch-api/positions.ts b/src/data-sources/monarch-api/positions.ts new file mode 100644 index 00000000..1751fdf2 --- /dev/null +++ b/src/data-sources/monarch-api/positions.ts @@ -0,0 +1,113 @@ +import { envioUserPositionForMarketQuery, envioUserPositionsPageQuery } from '@/graphql/envio-queries'; +import { ALL_SUPPORTED_NETWORKS, type SupportedNetworks } from '@/utils/networks'; +import { monarchGraphqlFetcher } from './fetchers'; + +type PositionMarket = { + marketUniqueKey: string; + chainId: number; +}; + +type MonarchUserPositionRow = { + marketId: string; + chainId: number; + supplyShares: string; + borrowShares: string; + collateral: string; +}; + +type MonarchUserPositionsPageResponse = { + data?: { + Position?: MonarchUserPositionRow[]; + }; +}; + +export type MonarchUserPositionState = { + supplyShares: string; + borrowShares: string; + collateral: string; +}; + +const MONARCH_POSITION_MARKETS_PAGE_SIZE = 500; + +const isNonZero = (value: string | null | undefined): boolean => { + return value !== null && value !== undefined && value !== '0'; +}; + +export const fetchMonarchUserPositionMarketsForNetworks = async ( + userAddress: string, + networks: 
SupportedNetworks[], +): Promise<PositionMarket[]> => { + if (networks.length === 0) { + return []; + } + + const requestedNetworks = new Set(networks); + const supportedNetworks = new Set(ALL_SUPPORTED_NETWORKS); + const positionMarkets = new Map<string, PositionMarket>(); + let offset = 0; + + while (true) { + const response = await monarchGraphqlFetcher<MonarchUserPositionsPageResponse>(envioUserPositionsPageQuery, { + user: userAddress.toLowerCase(), + chainIds: networks, + limit: MONARCH_POSITION_MARKETS_PAGE_SIZE, + offset, + }); + + const positions = response.data?.Position ?? []; + + for (const position of positions) { + const chainId = position.chainId as SupportedNetworks; + if (!supportedNetworks.has(chainId) || !requestedNetworks.has(chainId)) { + continue; + } + + if (!isNonZero(position.supplyShares) && !isNonZero(position.borrowShares) && !isNonZero(position.collateral)) { + continue; + } + + const positionMarket = { + marketUniqueKey: position.marketId, + chainId, + }; + + positionMarkets.set(`${positionMarket.marketUniqueKey.toLowerCase()}-${positionMarket.chainId}`, positionMarket); + } + + if (positions.length < MONARCH_POSITION_MARKETS_PAGE_SIZE) { + break; + } + + offset += positions.length; + } + + return Array.from(positionMarkets.values()); +}; + +export const fetchMonarchUserPositionStateForMarket = async ( + marketUniqueKey: string, + userAddress: string, + network: SupportedNetworks, +): Promise<MonarchUserPositionState | null> => { + const response = await monarchGraphqlFetcher<MonarchUserPositionsPageResponse>(envioUserPositionForMarketQuery, { + user: userAddress.toLowerCase(), + chainId: network, + marketId: marketUniqueKey.toLowerCase(), + }); + + const position = response.data?.Position?.[0]; + + if (!position) { + return null; + } + + if (!isNonZero(position.supplyShares) && !isNonZero(position.borrowShares) && !isNonZero(position.collateral)) { + return null; + } + + return { + supplyShares: position.supplyShares, + borrowShares: position.borrowShares, + collateral: position.collateral, + }; +}; diff --git a/src/data-sources/monarch-api/transactions.ts 
b/src/data-sources/monarch-api/transactions.ts index ed3b7f53..6620bacf 100644 --- a/src/data-sources/monarch-api/transactions.ts +++ b/src/data-sources/monarch-api/transactions.ts @@ -1,11 +1,8 @@ /** - * Monarch API Transactions + * Monarch admin/time-range transactions. * - * Fetches Monarch supply and withdraw transactions across all chains through the - * shared Monarch GraphQL endpoint. - * - * Uses separate pagination for supplies and withdraws to ensure complete data. - * Freezes endTimestamp at fetch start to ensure consistent pagination. + * This file is intentionally scoped to the Monarch dashboard feed. + * User-history fetch/normalize logic lives in `user-transactions.ts`. */ import { monarchGraphqlFetcher } from './fetchers'; diff --git a/src/data-sources/monarch-api/user-transactions.ts b/src/data-sources/monarch-api/user-transactions.ts new file mode 100644 index 00000000..91738fab --- /dev/null +++ b/src/data-sources/monarch-api/user-transactions.ts @@ -0,0 +1,198 @@ +import { buildEnvioUserTransactionsPageQuery } from '@/graphql/envio-queries'; +import { type UserTransaction, UserTxTypes } from '@/utils/types'; +import { emptyTransactionResponse, type TransactionFilters, type TransactionResponse } from '@/utils/user-transactions'; +import { dedupeUserTransactions, sortUserTransactions } from '@/utils/user-transactions'; +import { monarchGraphqlFetcher } from './fetchers'; + +const MAX_PAGES = 50; +const MONARCH_USER_TRANSACTIONS_BATCH_SIZE = 500; +const MONARCH_USER_TRANSACTIONS_TIMEOUT_MS = 15_000; + +type MonarchUserActivityRow = { + id: string; + txHash: string; + timestamp: string | number; + market_id: string; + assets: string; + shares?: string; +}; + +type MonarchUserLiquidationRow = { + id: string; + txHash: string; + timestamp: string | number; + market_id: string; + repaidAssets: string; +}; + +type MonarchUserTransactionsPageResponse = { + data?: { + supplies?: MonarchUserActivityRow[]; + withdraws?: MonarchUserActivityRow[]; + 
borrows?: MonarchUserActivityRow[]; + repays?: MonarchUserActivityRow[]; + supplyCollateral?: MonarchUserActivityRow[]; + withdrawCollateral?: MonarchUserActivityRow[]; + liquidations?: MonarchUserLiquidationRow[]; + }; +}; + +const toTimestamp = (value: string | number): number => { + return typeof value === 'number' ? value : Number(value); +}; + +const mapActivityRows = (rows: MonarchUserActivityRow[] | undefined, type: UserTxTypes, sharesFallback = '0'): UserTransaction[] => { + return (rows ?? []).map((row) => ({ + id: row.id, + hash: row.txHash, + timestamp: toTimestamp(row.timestamp), + type, + data: { + __typename: type, + shares: row.shares ?? sharesFallback, + assets: row.assets, + market: { + uniqueKey: row.market_id, + }, + }, + })); +}; + +const mapLiquidationRows = (rows: MonarchUserLiquidationRow[] | undefined): UserTransaction[] => { + return (rows ?? []).map((row) => ({ + id: row.id, + hash: row.txHash, + timestamp: toTimestamp(row.timestamp), + type: UserTxTypes.MarketLiquidation, + data: { + __typename: UserTxTypes.MarketLiquidation, + shares: '0', + assets: row.repaidAssets, + market: { + uniqueKey: row.market_id, + }, + }, + })); +}; + +const shouldContinuePaging = (response: MonarchUserTransactionsPageResponse, limit: number): boolean => { + const data = response.data; + if (!data) { + return false; + } + + return [data.supplies, data.withdraws, data.borrows, data.repays, data.supplyCollateral, data.withdrawCollateral, data.liquidations].some( + (rows) => (rows?.length ?? 
0) >= limit, + ); +}; + +const fetchMonarchUserTransactionsPage = async ( + query: string, + variables: Record<string, unknown>, +): Promise<MonarchUserTransactionsPageResponse> => { + const controller = new AbortController(); + const timeoutId = globalThis.setTimeout(() => { + controller.abort(); + }, MONARCH_USER_TRANSACTIONS_TIMEOUT_MS); + + try { + return await monarchGraphqlFetcher<MonarchUserTransactionsPageResponse>(query, variables, { + signal: controller.signal, + }); + } catch (error) { + if (error instanceof Error && error.name === 'AbortError') { + throw new Error(`Monarch user transaction request timed out after ${MONARCH_USER_TRANSACTIONS_TIMEOUT_MS}ms`); + } + + throw error; + } finally { + globalThis.clearTimeout(timeoutId); + } +}; + +const getUserAddressVariants = (userAddresses: string[]): string[] => { + const variants = new Set<string>(); + + for (const address of userAddresses) { + if (!address) { + continue; + } + + variants.add(address); + variants.add(address.toLowerCase()); + } + + return Array.from(variants); +}; + +export const fetchMonarchUserTransactions = async (filters: TransactionFilters): Promise<TransactionResponse> => { + const effectiveTimestampLte = filters.timestampLte ??
Math.floor(Date.now() / 1000); + const query = buildEnvioUserTransactionsPageQuery({ + useHashFilter: Boolean(filters.hash), + useMarketFilter: Boolean(filters.marketUniqueKeys?.length), + useTimestampGte: filters.timestampGte !== undefined && filters.timestampGte !== null, + useTimestampLte: true, + }); + const variables: Record<string, unknown> = { + chainId: filters.chainId, + userAddresses: getUserAddressVariants(filters.userAddress), + limit: MONARCH_USER_TRANSACTIONS_BATCH_SIZE, + offset: 0, + timestampLte: effectiveTimestampLte, + }; + + if (filters.marketUniqueKeys?.length) { + variables.marketIds = filters.marketUniqueKeys; + } + + if (filters.timestampGte !== undefined && filters.timestampGte !== null) { + variables.timestampGte = filters.timestampGte; + } + + if (filters.hash) { + variables.hash = filters.hash; + } + + const allTransactions: UserTransaction[] = []; + + for (let page = 0; page < MAX_PAGES; page++) { + variables.offset = page * MONARCH_USER_TRANSACTIONS_BATCH_SIZE; + + const response = await fetchMonarchUserTransactionsPage(query, variables); + const data = response.data; + + allTransactions.push( + ...mapActivityRows(data?.supplies, UserTxTypes.MarketSupply), + ...mapActivityRows(data?.withdraws, UserTxTypes.MarketWithdraw), + ...mapActivityRows(data?.borrows, UserTxTypes.MarketBorrow), + ...mapActivityRows(data?.repays, UserTxTypes.MarketRepay), + ...mapActivityRows(data?.supplyCollateral, UserTxTypes.MarketSupplyCollateral), + ...mapActivityRows(data?.withdrawCollateral, UserTxTypes.MarketWithdrawCollateral), + ...mapLiquidationRows(data?.liquidations), + ); + + const hasNextPage = shouldContinuePaging(response, MONARCH_USER_TRANSACTIONS_BATCH_SIZE); + if (!hasNextPage) { + break; + } + + if (page === MAX_PAGES - 1) { + return emptyTransactionResponse('Monarch user transaction history exceeded the safe pagination limit'); + } + } + + const dedupedTransactions = sortUserTransactions(dedupeUserTransactions(allTransactions)); + + const skip = filters.skip 
?? 0; + const first = filters.first ?? dedupedTransactions.length; + const items = dedupedTransactions.slice(skip, skip + first); + + return { + items, + pageInfo: { + count: items.length, + countTotal: dedupedTransactions.length, + }, + error: null, + }; +}; diff --git a/src/data-sources/morpho-api/transactions.ts b/src/data-sources/morpho-api/transactions.ts index ef54247e..000c4fd3 100644 --- a/src/data-sources/morpho-api/transactions.ts +++ b/src/data-sources/morpho-api/transactions.ts @@ -1,5 +1,5 @@ import { userTransactionsQuery } from '@/graphql/morpho-api-queries'; -import type { TransactionFilters, TransactionResponse } from '@/hooks/queries/fetchUserTransactions'; +import type { TransactionFilters, TransactionResponse } from '@/utils/user-transactions'; import { morphoGraphqlFetcher } from './fetchers'; // Define the expected shape of the GraphQL response for transactions diff --git a/src/data-sources/subgraph/transactions.ts b/src/data-sources/subgraph/transactions.ts index 5091ed43..6b92098f 100644 --- a/src/data-sources/subgraph/transactions.ts +++ b/src/data-sources/subgraph/transactions.ts @@ -1,8 +1,9 @@ import { getSubgraphUserTransactionsQuery } from '@/graphql/morpho-subgraph-queries'; -import type { TransactionFilters, TransactionResponse } from '@/hooks/queries/fetchUserTransactions'; +import type { TransactionFilters, TransactionResponse } from '@/utils/user-transactions'; import type { SupportedNetworks } from '@/utils/networks'; import { getSubgraphUrl } from '@/utils/subgraph-urls'; import { type UserTransaction, UserTxTypes } from '@/utils/types'; +import { sortUserTransactions } from '@/utils/user-transactions'; import type { SubgraphAccountData, SubgraphBorrowTx, @@ -31,6 +32,7 @@ const transformSubgraphTransactions = ( subgraphData.deposits.forEach((tx: SubgraphDepositTx) => { const type = tx.isCollateral ? 
UserTxTypes.MarketSupplyCollateral : UserTxTypes.MarketSupply; allTransactions.push({ + id: tx.id, hash: tx.hash, timestamp: Number.parseInt(tx.timestamp, 10), type: type, @@ -48,6 +50,7 @@ const transformSubgraphTransactions = ( subgraphData.withdraws.forEach((tx: SubgraphWithdrawTx) => { const type = tx.isCollateral ? UserTxTypes.MarketWithdrawCollateral : UserTxTypes.MarketWithdraw; allTransactions.push({ + id: tx.id, hash: tx.hash, timestamp: Number.parseInt(tx.timestamp, 10), type: type, @@ -64,6 +67,7 @@ const transformSubgraphTransactions = ( subgraphData.borrows.forEach((tx: SubgraphBorrowTx) => { allTransactions.push({ + id: tx.id, hash: tx.hash, timestamp: Number.parseInt(tx.timestamp, 10), type: UserTxTypes.MarketBorrow, @@ -80,6 +84,7 @@ const transformSubgraphTransactions = ( subgraphData.repays.forEach((tx: SubgraphRepayTx) => { allTransactions.push({ + id: tx.id, hash: tx.hash, timestamp: Number.parseInt(tx.timestamp, 10), type: UserTxTypes.MarketRepay, @@ -96,6 +101,7 @@ const transformSubgraphTransactions = ( subgraphData.liquidations.forEach((tx: SubgraphLiquidationTx) => { allTransactions.push({ + id: tx.id, hash: tx.hash, timestamp: Number.parseInt(tx.timestamp, 10), type: UserTxTypes.MarketLiquidation, @@ -110,7 +116,7 @@ const transformSubgraphTransactions = ( }); }); - allTransactions.sort((a, b) => b.timestamp - a.timestamp); + sortUserTransactions(allTransactions); // No client-side filtering needed - filtering is done at GraphQL level via market_in const count = allTransactions.length; diff --git a/src/features/market-detail/components/charts/borrowers-pie-chart.tsx b/src/features/market-detail/components/charts/borrowers-pie-chart.tsx index 6ff64c53..06a90730 100644 --- a/src/features/market-detail/components/charts/borrowers-pie-chart.tsx +++ b/src/features/market-detail/components/charts/borrowers-pie-chart.tsx @@ -97,7 +97,7 @@ function BorrowersPieTooltip({ } export function BorrowersPieChart({ chainId, market, oraclePrice }: 
BorrowersPieChartProps) { - const { data: borrowers, isLoading, totalCount } = useAllMarketBorrowers(market.uniqueKey, chainId); + const { data: borrowers, isLoading, totalCount } = useAllMarketBorrowers(market.uniqueKey, chainId, market.state); const { getVaultByAddress } = useVaultRegistry(); const [expandedOther, setExpandedOther] = useState(false); const chartColors = useChartColors(); diff --git a/src/features/market-detail/components/charts/debt-at-risk-chart.tsx b/src/features/market-detail/components/charts/debt-at-risk-chart.tsx index fc3b091a..1c34fbfc 100644 --- a/src/features/market-detail/components/charts/debt-at-risk-chart.tsx +++ b/src/features/market-detail/components/charts/debt-at-risk-chart.tsx @@ -57,7 +57,7 @@ function DebtAtRiskTooltip({ } export function DebtAtRiskChart({ chainId, market, oraclePrice }: DebtAtRiskChartProps) { - const { data: borrowers, isLoading } = useAllMarketBorrowers(market.uniqueKey, chainId); + const { data: borrowers, isLoading } = useAllMarketBorrowers(market.uniqueKey, chainId, market.state); const chartColors = useChartColors(); const lltv = useMemo(() => { diff --git a/src/features/market-detail/market-view.tsx b/src/features/market-detail/market-view.tsx index b6b6432f..4759b3e0 100644 --- a/src/features/market-detail/market-view.tsx +++ b/src/features/market-detail/market-view.tsx @@ -92,7 +92,7 @@ function MarketContent() { data: borrowersData, isLoading: borrowersLoading, totalCount: borrowersTotalCount, - } = useAllMarketBorrowers(market?.uniqueKey, network); + } = useAllMarketBorrowers(market?.uniqueKey, network, market?.state); const { data: suppliersData, isLoading: suppliersLoading, diff --git a/src/features/position-detail/components/history-tab.tsx b/src/features/position-detail/components/history-tab.tsx index 589f74cf..196eb3bc 100644 --- a/src/features/position-detail/components/history-tab.tsx +++ b/src/features/position-detail/components/history-tab.tsx @@ -1,6 +1,6 @@ 'use client'; -import { 
useMemo, useState } from 'react'; +import { useEffect, useMemo, useState } from 'react'; import { now, getLocalTimeZone, type ZonedDateTime } from '@internationalized/date'; import moment from 'moment'; import { formatUnits } from 'viem'; @@ -20,7 +20,6 @@ import { TableContainerWithHeader } from '@/components/common/table-container-wi import { Modal, ModalHeader, ModalBody, ModalFooter } from '@/components/common/Modal'; import { MarketIdentity, MarketIdentityFocus, MarketIdentityMode } from '@/features/markets/components/market-identity'; import { UserPositionsChart } from '@/features/positions/components/user-positions-chart'; -import { useProcessedMarkets } from '@/hooks/useProcessedMarkets'; import { useUserTransactionsQuery } from '@/hooks/queries/useUserTransactionsQuery'; import { useDisclosure } from '@/hooks/useDisclosure'; import { useStyledToast } from '@/hooks/useStyledToast'; @@ -33,6 +32,8 @@ import type { SupportedNetworks } from '@/utils/networks'; const PAGE_SIZE = 10; +const getChainScopedMarketKey = (chainId: number, uniqueKey: string): string => `${chainId}:${uniqueKey.toLowerCase()}`; + type HistoryTabProps = { groupedPosition: GroupedPosition; chainId: SupportedNetworks; @@ -43,7 +44,6 @@ type HistoryTabProps = { }; export function HistoryTab({ groupedPosition, chainId, userAddress, transactions, snapshotsByChain, actualBlockData }: HistoryTabProps) { - const { allMarkets, loading: loadingMarkets } = useProcessedMarkets(); const toast = useStyledToast(); const [startDate, setStartDate] = useState(null); @@ -65,23 +65,43 @@ export function HistoryTab({ groupedPosition, chainId, userAddress, transactions } = useUserTransactionsQuery({ filters: { userAddress: [userAddress], - first: PAGE_SIZE, - skip: (currentPage - 1) * PAGE_SIZE, marketUniqueKeys: marketIdFilter, - chainId: chainId, + chainId, timestampGte: startDate ? Math.floor(startDate.toDate().getTime() / 1000) : undefined, timestampLte: endDate ? 
Math.floor(endDate.toDate().getTime() / 1000) : undefined, }, - enabled: allMarkets.length > 0, + paginate: true, + enabled: marketIdFilter.length > 0, }); - const loading = loadingHistory || loadingMarkets; - const history = data?.items ?? []; - const totalPages = data ? Math.ceil(data.pageInfo.countTotal / PAGE_SIZE) : 0; - const totalEntries = data?.pageInfo.countTotal ?? 0; + const loading = loadingHistory; + const marketMap = useMemo(() => { + const nextMap = new Map(); + + for (const position of groupedPosition.markets) { + nextMap.set(getChainScopedMarketKey(position.market.morphoBlue.chain.id, position.market.uniqueKey), position.market); + } + + return nextMap; + }, [groupedPosition.markets]); + const filteredHistory = data?.items ?? []; + const totalEntries = filteredHistory.length; + const totalPages = Math.ceil(totalEntries / PAGE_SIZE); + const history = useMemo(() => { + const safeTotalPages = Math.max(1, totalPages); + const safePage = Math.max(1, Math.min(currentPage, safeTotalPages)); + const startIndex = (safePage - 1) * PAGE_SIZE; + return filteredHistory.slice(startIndex, startIndex + PAGE_SIZE); + }, [currentPage, filteredHistory, totalPages]); const maxDate = useMemo(() => now(getLocalTimeZone()), []); + useEffect(() => { + if (totalPages > 0 && currentPage > totalPages) { + setCurrentPage(totalPages); + } + }, [currentPage, totalPages]); + const handleStartDateChange = (date: ZonedDateTime) => { if (endDate && date > endDate) setEndDate(date); setStartDate(date); @@ -296,7 +316,7 @@ export function HistoryTab({ groupedPosition, chainId, userAddress, transactions history.map((tx, index) => { if (!tx.data.market) return null; - const market = allMarkets.find((m) => m.uniqueKey === tx.data.market.uniqueKey) as Market | undefined; + const market = marketMap.get(getChainScopedMarketKey(chainId, tx.data.market.uniqueKey)); if (!market) return null; const isSupply = tx.type === UserTxTypes.MarketSupply; diff --git 
a/src/features/position-detail/hooks/usePositionDetailData.ts b/src/features/position-detail/hooks/usePositionDetailData.ts
index 9439bdac..dd4f9215 100644
--- a/src/features/position-detail/hooks/usePositionDetailData.ts
+++ b/src/features/position-detail/hooks/usePositionDetailData.ts
@@ -37,9 +37,9 @@ export function usePositionDetailData({
   // Group all positions across all chains
   const allPositions = useMemo(() => {
     if (!positions) return [];
-    const grouped = groupPositionsByLoanAsset(positions);
+    const grouped = groupPositionsByLoanAsset(positions, actualBlockData);
     return processCollaterals(grouped);
-  }, [positions]);
+  }, [positions, actualBlockData]);
 
   // Find current position from the all-chains result
   const currentPosition = useMemo(() => {
diff --git a/src/features/positions/components/supplied-morpho-blue-grouped-table.tsx b/src/features/positions/components/supplied-morpho-blue-grouped-table.tsx
index 859ddda1..7b90ec80 100644
--- a/src/features/positions/components/supplied-morpho-blue-grouped-table.tsx
+++ b/src/features/positions/components/supplied-morpho-blue-grouped-table.tsx
@@ -81,7 +81,7 @@ export function SuppliedMorphoBlueGroupedTable({
     all: 'All',
   };
 
-  const groupedPositions = useMemo(() => groupPositionsByLoanAsset(positions), [positions]);
+  const groupedPositions = useMemo(() => groupPositionsByLoanAsset(positions, actualBlockData), [positions, actualBlockData]);
   const isOwner = useMemo(() => !!account && !!address && account.toLowerCase() === address.toLowerCase(), [account, address]);
 
   const processedPositions = useMemo(() => processCollaterals(groupedPositions), [groupedPositions]);
diff --git a/src/graphql/envio-queries.ts b/src/graphql/envio-queries.ts
index 31eb7ebb..c4d94d7c 100644
--- a/src/graphql/envio-queries.ts
+++ b/src/graphql/envio-queries.ts
@@ -35,6 +35,196 @@
 }
 `;
 
+export const envioUserPositionsPageQuery = `
+  query EnvioUserPositionsPage($user: String!, $chainIds: [Int!], 
$limit: Int!, $offset: Int!) { + Position( + where: { + user: { _eq: $user } + chainId: { _in: $chainIds } + } + limit: $limit + offset: $offset + order_by: [{ chainId: asc }, { marketId: asc }] + ) { + marketId + chainId + supplyShares + borrowShares + collateral + } + } +`; + +export const envioUserPositionForMarketQuery = ` + query EnvioUserPositionForMarket($user: String!, $chainId: Int!, $marketId: String!) { + Position( + where: { + user: { _eq: $user } + chainId: { _eq: $chainId } + marketId: { _eq: $marketId } + } + limit: 1 + ) { + marketId + chainId + supplyShares + borrowShares + collateral + } + } +`; + +export const buildEnvioUserTransactionsPageQuery = ({ + useHashFilter, + useMarketFilter, + useTimestampGte, + useTimestampLte, +}: { + useHashFilter: boolean; + useMarketFilter: boolean; + useTimestampGte: boolean; + useTimestampLte: boolean; +}): string => { + const variableDeclarations = ['$chainId: Int!', '$userAddresses: [String!]!', '$limit: Int!', '$offset: Int!']; + + if (useMarketFilter) { + variableDeclarations.push('$marketIds: [String!]!'); + } + + if (useTimestampGte) { + variableDeclarations.push('$timestampGte: numeric!'); + } + + if (useTimestampLte) { + variableDeclarations.push('$timestampLte: numeric!'); + } + + if (useHashFilter) { + variableDeclarations.push('$hash: String!'); + } + + const buildWhere = (userField: 'onBehalf' | 'borrower'): string => { + const whereClauses = ['chainId: { _eq: $chainId }', `${userField}: { _in: $userAddresses }`]; + + if (useMarketFilter) { + whereClauses.push('market_id: { _in: $marketIds }'); + } + + if (useTimestampGte || useTimestampLte) { + const timestampFilters: string[] = []; + + if (useTimestampGte) { + timestampFilters.push('_gte: $timestampGte'); + } + + if (useTimestampLte) { + timestampFilters.push('_lte: $timestampLte'); + } + + whereClauses.push(`timestamp: { ${timestampFilters.join(' ')} }`); + } + + if (useHashFilter) { + whereClauses.push('txHash: { _eq: $hash }'); + } + + return 
whereClauses.join(' '); + }; + + return ` + query EnvioUserTransactionsPage(${variableDeclarations.join(', ')}) { + supplies: Morpho_Supply( + where: { ${buildWhere('onBehalf')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + assets + shares + } + withdraws: Morpho_Withdraw( + where: { ${buildWhere('onBehalf')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + assets + shares + } + borrows: Morpho_Borrow( + where: { ${buildWhere('onBehalf')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + assets + shares + } + repays: Morpho_Repay( + where: { ${buildWhere('onBehalf')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + assets + shares + } + supplyCollateral: Morpho_SupplyCollateral( + where: { ${buildWhere('onBehalf')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + assets + } + withdrawCollateral: Morpho_WithdrawCollateral( + where: { ${buildWhere('onBehalf')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + assets + } + liquidations: Morpho_Liquidate( + where: { ${buildWhere('borrower')} } + limit: $limit + offset: $offset + order_by: [{ timestamp: desc }, { txHash: desc }, { id: desc }] + ) { + id + txHash + timestamp + market_id + repaidAssets + } + } + `; +}; + export const envioSupplyWithdrawPageQuery = ` query EnvioSupplyWithdrawPage($chainId: Int!, $marketId: String!, $minAssets: numeric!, $limit: Int!, $offset: Int!) 
{ supplies: Morpho_Supply( diff --git a/src/hooks/queries/fetchUserTransactions.ts b/src/hooks/queries/fetchUserTransactions.ts index d506a9d5..45a19334 100644 --- a/src/hooks/queries/fetchUserTransactions.ts +++ b/src/hooks/queries/fetchUserTransactions.ts @@ -1,36 +1,88 @@ import { supportsMorphoApi } from '@/config/dataSources'; +import { fetchMonarchUserTransactions } from '@/data-sources/monarch-api/user-transactions'; import { fetchMorphoTransactions } from '@/data-sources/morpho-api/transactions'; import { fetchSubgraphTransactions } from '@/data-sources/subgraph/transactions'; import { isSupportedChain } from '@/utils/networks'; import type { UserTransaction } from '@/utils/types'; +import { + compareUserTransactions, + emptyTransactionResponse, + type TransactionFilters, + type TransactionResponse, +} from '@/utils/user-transactions'; -/** - * Filters for fetching user transactions. - * Requires a single chainId - for multi-chain queries, use useUserTransactionsQuery with paginate: true. 
- */ -export type TransactionFilters = { - userAddress: string[]; - chainId: number; - marketUniqueKeys?: string[]; - timestampGte?: number; - timestampLte?: number; - skip?: number; - first?: number; - hash?: string; - assetIds?: string[]; +const MAX_FALLBACK_PAGES = 50; +const FALLBACK_PARALLEL_PAGE_BATCH_SIZE = 5; + +type FallbackTransactionsSource = 'morpho' | 'subgraph'; + +const canUseMonarchTransactions = (filters: TransactionFilters): boolean => { + return !filters.assetIds?.length; }; -export type TransactionResponse = { - items: UserTransaction[]; - pageInfo: { - count: number; - countTotal: number; +const fetchFallbackTransactionsFromSource = async ( + source: FallbackTransactionsSource, + filters: TransactionFilters, +): Promise => { + const { chainId } = filters; + + if (source === 'morpho') { + try { + return await fetchMorphoTransactions(filters); + } catch (morphoError) { + const errorMsg = `Failed to fetch transactions from Morpho API: ${(morphoError as Error)?.message ?? 'Unknown error'}`; + console.warn(`Morpho API failed for chain ${chainId}:`, morphoError); + return emptyTransactionResponse(errorMsg); + } + } + + try { + return await fetchSubgraphTransactions(filters, chainId); + } catch (subgraphError) { + const errorMsg = `Failed to fetch transactions: ${(subgraphError as Error)?.message ?? 
'Unknown error'}`; + console.error(errorMsg); + return emptyTransactionResponse(errorMsg); + } +}; + +const selectFallbackUserTransactionsSource = async ( + filters: TransactionFilters, +): Promise<{ source: FallbackTransactionsSource | null; response: TransactionResponse }> => { + const { chainId } = filters; + + if (supportsMorphoApi(chainId)) { + const morphoResponse = await fetchFallbackTransactionsFromSource('morpho', filters); + if (!morphoResponse.error) { + return { + source: 'morpho', + response: morphoResponse, + }; + } + } + + if (filters.userAddress.length !== 1) { + const errorMsg = 'Subgraph data source requires exactly one user address.'; + console.error(errorMsg); + return { + source: null, + response: emptyTransactionResponse(errorMsg), + }; + } + + const subgraphResponse = await fetchFallbackTransactionsFromSource('subgraph', filters); + return { + source: subgraphResponse.error ? null : 'subgraph', + response: subgraphResponse, }; - error: string | null; +}; + +const fetchFallbackUserTransactions = async (filters: TransactionFilters): Promise => { + const { response } = await selectFallbackUserTransactionsSource(filters); + return response; }; /** - * Fetches user transactions for a SINGLE chain from Morpho API or Subgraph. + * Fetches user transactions for a SINGLE chain from Monarch, Morpho API, or Subgraph. * For multi-chain queries, use useUserTransactionsQuery with paginate: true. 
* * @param filters - Transaction filters (chainId is required) @@ -39,51 +91,143 @@ export type TransactionResponse = { export async function fetchUserTransactions(filters: TransactionFilters): Promise { const { chainId } = filters; - // Validate chainId + if (filters.userAddress.length === 0) { + return emptyTransactionResponse(); + } + if (!isSupportedChain(chainId)) { console.warn(`Unsupported chain: ${chainId}`); - return { - items: [], - pageInfo: { count: 0, countTotal: 0 }, - error: `Unsupported chain: ${chainId}`, - }; + return emptyTransactionResponse(`Unsupported chain: ${chainId}`); } - // Check subgraph user address limitation - if (!supportsMorphoApi(chainId) && filters.userAddress.length !== 1) { - const errorMsg = 'Subgraph data source requires exactly one user address.'; - console.error(errorMsg); - return { - items: [], - pageInfo: { count: 0, countTotal: 0 }, - error: errorMsg, - }; + if (canUseMonarchTransactions(filters)) { + try { + const response = await fetchMonarchUserTransactions(filters); + if (!response.error) { + return response; + } + } catch (monarchError) { + console.warn(`Monarch API failed for chain ${chainId}, falling back to Morpho/Subgraph:`, monarchError); + } } - // Try Morpho API first if supported - if (supportsMorphoApi(chainId)) { + return fetchFallbackUserTransactions(filters); +} + +export async function fetchAllUserTransactions(filters: TransactionFilters, pageSize = 1000): Promise { + const frozenFilters: TransactionFilters = { + ...filters, + timestampLte: filters.timestampLte ?? 
Math.floor(Date.now() / 1000), + }; + const { chainId } = filters; + + if (frozenFilters.userAddress.length === 0) { + return emptyTransactionResponse(); + } + + if (!isSupportedChain(chainId)) { + console.warn(`Unsupported chain: ${chainId}`); + return emptyTransactionResponse(`Unsupported chain: ${chainId}`); + } + + if (canUseMonarchTransactions(frozenFilters)) { try { - const response = await fetchMorphoTransactions(filters); + const response = await fetchMonarchUserTransactions({ + ...frozenFilters, + first: undefined, + skip: 0, + }); if (!response.error) { return response; } - // Morpho API returned an error, fall through to Subgraph - } catch (morphoError) { - console.warn(`Morpho API failed for chain ${chainId}, falling back to Subgraph:`, morphoError); - // Fall through to Subgraph + } catch (monarchError) { + console.warn(`Monarch API failed for chain ${chainId}, falling back to Morpho/Subgraph:`, monarchError); } } - // Fallback to Subgraph - try { - return await fetchSubgraphTransactions(filters, chainId); - } catch (subgraphError) { - const errorMsg = `Failed to fetch transactions: ${(subgraphError as Error)?.message ?? 
'Unknown error'}`; - console.error(errorMsg); + const allItems: UserTransaction[] = []; + const firstPageFilters: TransactionFilters = { + ...frozenFilters, + first: pageSize, + skip: 0, + }; + const { source, response: firstPage } = await selectFallbackUserTransactionsSource(firstPageFilters); + + if (!source || (firstPage.error && firstPage.items.length === 0)) { + return firstPage; + } + + allItems.push(...firstPage.items); + + if (source === 'morpho') { + const totalPages = Math.ceil(firstPage.pageInfo.countTotal / pageSize); + if (totalPages > MAX_FALLBACK_PAGES) { + return emptyTransactionResponse(`Fallback transaction history exceeded the safe pagination limit (${totalPages} pages)`); + } + + for (let startPage = 1; startPage < totalPages; startPage += FALLBACK_PARALLEL_PAGE_BATCH_SIZE) { + const endPage = Math.min(startPage + FALLBACK_PARALLEL_PAGE_BATCH_SIZE, totalPages); + const batchPages = Array.from({ length: endPage - startPage }, (_, index) => startPage + index); + const batchResponses = await Promise.all( + batchPages.map((page) => + fetchFallbackTransactionsFromSource(source, { + ...frozenFilters, + first: pageSize, + skip: page * pageSize, + }), + ), + ); + + const failedResponse = batchResponses.find((response) => response.error); + if (failedResponse) { + return emptyTransactionResponse(failedResponse.error); + } + + allItems.push(...batchResponses.flatMap((response) => response.items)); + } + + allItems.sort(compareUserTransactions); + return { - items: [], - pageInfo: { count: 0, countTotal: 0 }, - error: errorMsg, + items: allItems, + pageInfo: { + count: allItems.length, + countTotal: firstPage.pageInfo.countTotal, + }, + error: null, }; } + + for (let page = 1; page < MAX_FALLBACK_PAGES; page++) { + const response = await fetchFallbackTransactionsFromSource(source, { + ...frozenFilters, + first: pageSize, + skip: page * pageSize, + }); + + if (response.error) { + return emptyTransactionResponse(response.error); + } + + 
allItems.push(...response.items); + + if (response.items.length < pageSize) { + break; + } + + if (page === MAX_FALLBACK_PAGES - 1) { + return emptyTransactionResponse('Fallback transaction history exceeded the safe pagination limit'); + } + } + + allItems.sort(compareUserTransactions); + + return { + items: allItems, + pageInfo: { + count: allItems.length, + countTotal: source === 'subgraph' ? allItems.length : firstPage.pageInfo.countTotal, + }, + error: null, + }; } diff --git a/src/hooks/queries/useUserTransactionsQuery.ts b/src/hooks/queries/useUserTransactionsQuery.ts index 19e3bbdc..e1bb366d 100644 --- a/src/hooks/queries/useUserTransactionsQuery.ts +++ b/src/hooks/queries/useUserTransactionsQuery.ts @@ -1,7 +1,7 @@ import { useQuery } from '@tanstack/react-query'; -import { fetchUserTransactions, type TransactionFilters, type TransactionResponse } from './fetchUserTransactions'; +import { fetchAllUserTransactions, fetchUserTransactions } from './fetchUserTransactions'; import { ALL_SUPPORTED_NETWORKS } from '@/utils/networks'; -import type { UserTransaction } from '@/utils/types'; +import { compareUserTransactions, type TransactionFilters, type TransactionResponse } from '@/utils/user-transactions'; /** * Filter options for the hook. @@ -27,13 +27,13 @@ type UseUserTransactionsQueryOptions = { }; /** - * Fetches user transactions from Morpho API or Subgraph using React Query. + * Fetches user transactions from Monarch, Morpho API, or Subgraph using React Query. 
* * Data fetching strategy: * - For non-paginated queries: requires single chainId, fetches with skip/first * - For paginated queries: can use multiple chainIds, fetches ALL data in parallel - * - Tries Morpho API first (if supported for the network) - * - Falls back to Subgraph if API fails or not supported + * - Tries Monarch first when the requested filters are supported by Envio + * - Falls back to Morpho API and then Subgraph when Monarch is empty, unsupported, or fails */ export const useUserTransactionsQuery = (options: UseUserTransactionsQueryOptions) => { const { filters, enabled = true, paginate = false, pageSize = 1000 } = options; @@ -56,45 +56,21 @@ export const useUserTransactionsQuery = (options: UseUserTransactionsQueryOption ], queryFn: async () => { if (paginate) { - // Paginate mode: fetch ALL transactions, supports multi-chain const chainIds = filters.chainIds ?? (filters.chainId ? [filters.chainId] : ALL_SUPPORTED_NETWORKS); - - // Helper to fetch all pages for one chain - const fetchAllForChain = async (chainId: number): Promise => { - const items: UserTransaction[] = []; - let skip = 0; - let hasMore = true; - - while (hasMore) { - const response = await fetchUserTransactions({ - ...filters, - chainId, - first: pageSize, - skip, - }); - - items.push(...response.items); - skip += response.items.length; - - // Stop if we got fewer items than requested (last page) - hasMore = response.items.length >= pageSize; - - // Safety: max 50 pages per chain to prevent infinite loops - if (skip >= 50 * pageSize) { - console.warn(`Transaction pagination limit reached for chain ${chainId} (50 pages)`); - break; - } - } - - return items; - }; - - // Fetch ALL chains IN PARALLEL - const results = await Promise.all(chainIds.map(fetchAllForChain)); - const allItems = results.flat(); - - // Sort combined results by timestamp (descending) - allItems.sort((a, b) => b.timestamp - a.timestamp); + const results = await Promise.all( + chainIds.map((chainId) => + 
fetchAllUserTransactions( + { + ...filters, + chainId, + }, + pageSize, + ), + ), + ); + const allItems = results.flatMap((result) => result.items); + allItems.sort(compareUserTransactions); + const error = results.find((result) => result.error)?.error ?? null; return { items: allItems, @@ -102,7 +78,7 @@ export const useUserTransactionsQuery = (options: UseUserTransactionsQueryOption count: allItems.length, countTotal: allItems.length, }, - error: null, + error, }; } @@ -112,7 +88,7 @@ export const useUserTransactionsQuery = (options: UseUserTransactionsQueryOption } // Simple case: fetch once with limit - return await fetchUserTransactions({ + return fetchUserTransactions({ ...filters, chainId: filters.chainId, first: filters.first ?? pageSize, diff --git a/src/hooks/queries/useUserVaultsV2Query.ts b/src/hooks/queries/useUserVaultsV2Query.ts index a8ab5dee..90255805 100644 --- a/src/hooks/queries/useUserVaultsV2Query.ts +++ b/src/hooks/queries/useUserVaultsV2Query.ts @@ -1,9 +1,9 @@ import { useQuery } from '@tanstack/react-query'; import type { Address } from 'viem'; import { useConnection } from 'wagmi'; -import { fetchMorphoVaultApys } from '@/data-sources/morpho-api/vaults'; import { fetchUserVaultV2DetailsAllNetworks, type UserVaultV2 } from '@/data-sources/monarch-api/vaults'; import { fetchUserVaultShares, fetchVaultTotalAssets, getVaultReadKey } from '@/utils/vaultAllocation'; +import { fetchVaultYieldSnapshots, type VaultYieldSnapshot } from '@/utils/vaultYield'; type UseUserVaultsV2Options = { includeApy?: boolean; @@ -34,15 +34,15 @@ async function fetchAndProcessVaults({ return []; } - const [avgApyByVault, shareBalances, totalAssetsByVault] = await Promise.all([ + const [yieldSnapshotsByVault, shareBalances, totalAssetsByVault] = await Promise.all([ includeApy - ? fetchMorphoVaultApys( - validVaults.map((vault) => ({ - address: vault.address, + ? 
fetchVaultYieldSnapshots({ + vaults: validVaults.map((vault) => ({ + address: vault.address as Address, networkId: vault.networkId, })), - ) - : Promise.resolve(new Map()), + }) + : Promise.resolve(new Map()), includeBalances ? fetchUserVaultShares( validVaults.map((v) => ({ address: v.address as Address, networkId: v.networkId })), @@ -61,7 +61,7 @@ async function fetchAndProcessVaults({ return { ...vault, adapter: vault.adapters[0] as Address | undefined, - avgApy: avgApyByVault.get(vaultKey), + avgApy: yieldSnapshotsByVault.get(vaultKey)?.vaultApy ?? undefined, balance: includeBalances ? (shareBalances.has(vaultKey) ? shareBalances.get(vaultKey) : undefined) : undefined, totalAssets: totalAssetsByVault.get(vaultKey), }; @@ -73,7 +73,7 @@ async function fetchAndProcessVaults({ * * Data fetching strategy: * - Fetches cross-chain vault details from Monarch API - * - Optionally enriches current APY from batched Morpho API vault rates + * - Optionally enriches current APY from batched RPC share-price yield snapshots * - Optionally enriches user's share balances via multicall * - Optionally enriches vault total assets via multicall * - Returns complete vault data with optional on-chain enrichments diff --git a/src/hooks/use4626VaultAPR.ts b/src/hooks/use4626VaultAPR.ts index 2b2cc75a..5e7f11ac 100644 --- a/src/hooks/use4626VaultAPR.ts +++ b/src/hooks/use4626VaultAPR.ts @@ -1,19 +1,18 @@ import { useMemo } from 'react'; import { useQuery } from '@tanstack/react-query'; import type { Address, Hex } from 'viem'; -import { erc4626Abi } from '@/abis/erc4626'; import morphoAbi from '@/abis/morpho'; import { useCustomRpcContext } from '@/components/providers/CustomRpcProvider'; import { computeAnnualizedApyFromGrowth, computeExpectedNetCarryApy } from '@/hooks/leverage/math'; -import { estimateBlockAtTimestamp } from '@/utils/blockEstimation'; import { getMorphoAddress } from '@/utils/morpho'; import type { SupportedNetworks } from '@/utils/networks'; import { getClient } 
from '@/utils/rpc'; import type { Market } from '@/utils/types'; +import { fetchVaultYieldSnapshots } from '@/utils/vaultYield'; +import { getVaultReadKey } from '@/utils/vaultAllocation'; const DEFAULT_LOOKBACK_DAYS = 3; const BORROW_INDEX_SCALE = 10n ** 18n; -const SECONDS_PER_DAY = 24 * 60 * 60; type Use4626VaultAPRParams = { market: Market; @@ -83,91 +82,76 @@ export function use4626VaultAPR({ }; } - const client = getClient(chainId, customRpcUrl); - const currentBlock = await client.getBlockNumber(); - const currentBlockData = await client.getBlock({ blockNumber: currentBlock }); - const currentTimestamp = Number(currentBlockData.timestamp); - - const targetTimestamp = currentTimestamp - lookbackDays * SECONDS_PER_DAY; - // WHY: estimate a historical block close to the target window, then annualize using real block timestamps. - const estimatedPastBlock = estimateBlockAtTimestamp(chainId, targetTimestamp, Number(currentBlock), currentTimestamp); - const pastBlockData = await client.getBlock({ blockNumber: BigInt(estimatedPastBlock) }); - const pastTimestamp = Number(pastBlockData.timestamp); - const periodSeconds = currentTimestamp - pastTimestamp; - - if (periodSeconds <= 0) { + const vaultYieldSnapshots = await fetchVaultYieldSnapshots({ + vaults: [{ address: vaultAddress, networkId: chainId }], + lookbackDays, + customRpcUrls: { [chainId]: customRpcUrl }, + throwOnFailure: true, + }); + const vaultYieldSnapshot = vaultYieldSnapshots.get(getVaultReadKey(vaultAddress, chainId)) ?? null; + + if (!vaultYieldSnapshot?.currentBlock || !vaultYieldSnapshot.pastBlock || !vaultYieldSnapshot.periodSeconds) { return { - vaultApy3d: null, + vaultApy3d: vaultYieldSnapshot?.vaultApy ?? null, borrowApy3d: null, - sharePriceNow: null, - periodSeconds: null, + sharePriceNow: vaultYieldSnapshot?.sharePriceNow ?? null, + periodSeconds: vaultYieldSnapshot?.periodSeconds ?? 
null, }; } + const client = getClient(chainId, customRpcUrl); const morphoAddress = getMorphoAddress(chainId); - const contracts = [ - { - address: vaultAddress, - abi: erc4626Abi, - functionName: 'previewRedeem' as const, - args: [oneShareUnit] as const, - }, - { - address: morphoAddress as Address, - abi: morphoAbi, - functionName: 'market' as const, - args: [market.uniqueKey as Hex] as const, - }, - ] as const; - - const currentResults = await client.multicall({ - contracts, + const currentMarketResults = await client.multicall({ + contracts: [ + { + address: morphoAddress as Address, + abi: morphoAbi, + functionName: 'market' as const, + args: [market.uniqueKey as Hex] as const, + }, + ], allowFailure: true, + blockNumber: vaultYieldSnapshot.currentBlock, }); - let pastResults: typeof currentResults | null = null; + let pastMarketResults: typeof currentMarketResults | null = null; try { - pastResults = await client.multicall({ - contracts, + pastMarketResults = await client.multicall({ + contracts: [ + { + address: morphoAddress as Address, + abi: morphoAbi, + functionName: 'market' as const, + args: [market.uniqueKey as Hex] as const, + }, + ], allowFailure: true, - blockNumber: BigInt(estimatedPastBlock), + blockNumber: vaultYieldSnapshot.pastBlock, }); } catch { // Some RPCs are non-archive and cannot serve historical eth_call at past blocks. - pastResults = null; + pastMarketResults = null; } - const currentSharePrice = - currentResults[0].status === 'success' && typeof currentResults[0].result === 'bigint' ? currentResults[0].result : null; - const currentBorrowIndex = currentResults[1].status === 'success' ? readBorrowIndex(asBigIntArray(currentResults[1].result)) : null; - - const pastSharePrice = - pastResults?.[0]?.status === 'success' && typeof pastResults[0].result === 'bigint' ? pastResults[0].result : null; - const pastBorrowIndex = pastResults?.[1]?.status === 'success' ? 
readBorrowIndex(asBigIntArray(pastResults[1].result)) : null; - - const vaultApy3d = - currentSharePrice && pastSharePrice - ? computeAnnualizedApyFromGrowth({ - currentValue: currentSharePrice, - pastValue: pastSharePrice, - periodSeconds, - }) - : null; + const currentBorrowIndex = + currentMarketResults[0].status === 'success' ? readBorrowIndex(asBigIntArray(currentMarketResults[0].result)) : null; + const pastBorrowIndex = + pastMarketResults?.[0]?.status === 'success' ? readBorrowIndex(asBigIntArray(pastMarketResults[0].result)) : null; const borrowApy3d = currentBorrowIndex && pastBorrowIndex ? computeAnnualizedApyFromGrowth({ currentValue: currentBorrowIndex, pastValue: pastBorrowIndex, - periodSeconds, + periodSeconds: vaultYieldSnapshot.periodSeconds, }) : null; return { - vaultApy3d, + vaultApy3d: vaultYieldSnapshot.vaultApy, borrowApy3d, - sharePriceNow: currentSharePrice, - periodSeconds, + sharePriceNow: vaultYieldSnapshot.sharePriceNow, + periodSeconds: vaultYieldSnapshot.periodSeconds, }; }, }); diff --git a/src/hooks/useAllMarketPositions.ts b/src/hooks/useAllMarketPositions.ts index ebf487f6..939274a6 100644 --- a/src/hooks/useAllMarketPositions.ts +++ b/src/hooks/useAllMarketPositions.ts @@ -1,11 +1,12 @@ import { useQuery } from '@tanstack/react-query'; import { supportsMorphoApi } from '@/config/dataSources'; +import { fetchMonarchMarketBorrowers, fetchMonarchMarketSuppliers } from '@/data-sources/monarch-api'; import { fetchMorphoMarketBorrowers } from '@/data-sources/morpho-api/market-borrowers'; import { fetchMorphoMarketSuppliers } from '@/data-sources/morpho-api/market-suppliers'; import { fetchSubgraphMarketBorrowers } from '@/data-sources/subgraph/market-borrowers'; import { fetchSubgraphMarketSuppliers } from '@/data-sources/subgraph/market-suppliers'; import type { SupportedNetworks } from '@/utils/networks'; -import type { MarketBorrower, MarketSupplier } from '@/utils/types'; +import type { Market, MarketBorrower, MarketSupplier } 
from '@/utils/types'; const TOP_POSITIONS_LIMIT = 1000; @@ -27,31 +28,33 @@ type UseAllSuppliersResult = { * Fetches top borrowers for chart aggregation (non-paginated). * Retrieves up to 1000 positions sorted by borrow shares descending. */ -export const useAllMarketBorrowers = (marketId: string | undefined, network: SupportedNetworks | undefined): UseAllBorrowersResult => { +export const useAllMarketBorrowers = ( + marketId: string | undefined, + network: SupportedNetworks | undefined, + marketState: Pick | undefined, +): UseAllBorrowersResult => { const { data, isLoading, error } = useQuery({ - queryKey: ['allMarketBorrowers', marketId, network], + queryKey: ['allMarketBorrowers', marketId, network, marketState?.borrowAssets, marketState?.borrowShares], queryFn: async () => { - if (!marketId || !network) return null; + if (!marketId || !network || !marketState) return null; - let result = null; + try { + return await fetchMonarchMarketBorrowers(marketId, Number(network), marketState, '1', TOP_POSITIONS_LIMIT, 0); + } catch { + // Continue to fallback providers. + } - // Try Morpho API first if (supportsMorphoApi(network)) { try { - result = await fetchMorphoMarketBorrowers(marketId, Number(network), '1', TOP_POSITIONS_LIMIT, 0); + return await fetchMorphoMarketBorrowers(marketId, Number(network), '1', TOP_POSITIONS_LIMIT, 0); } catch { - // Morpho API failed, will fall back to subgraph + // Continue to subgraph fallback. 
} } - // Fallback to Subgraph if Morpho API failed or returned empty - if (!result || result.items?.length === 0) { - result = await fetchSubgraphMarketBorrowers(marketId, network, '1', TOP_POSITIONS_LIMIT, 0); - } - - return result; + return fetchSubgraphMarketBorrowers(marketId, network, '1', TOP_POSITIONS_LIMIT, 0); }, - enabled: !!marketId && !!network, + enabled: !!marketId && !!network && !!marketState, staleTime: 1000 * 60 * 2, // 2 minutes }); @@ -73,23 +76,21 @@ export const useAllMarketSuppliers = (marketId: string | undefined, network: Sup queryFn: async () => { if (!marketId || !network) return null; - let result = null; + try { + return await fetchMonarchMarketSuppliers(marketId, Number(network), '1', TOP_POSITIONS_LIMIT, 0); + } catch { + // Continue to fallback providers. + } - // Try Morpho API first if (supportsMorphoApi(network)) { try { - result = await fetchMorphoMarketSuppliers(marketId, Number(network), '1', TOP_POSITIONS_LIMIT, 0); + return await fetchMorphoMarketSuppliers(marketId, Number(network), '1', TOP_POSITIONS_LIMIT, 0); } catch { - // Morpho API failed, will fall back to subgraph + // Continue to subgraph fallback. 
} } - // Fallback to Subgraph if Morpho API failed or returned empty - if (!result || result.items?.length === 0) { - result = await fetchSubgraphMarketSuppliers(marketId, network, '1', TOP_POSITIONS_LIMIT, 0); - } - - return result; + return fetchSubgraphMarketSuppliers(marketId, network, '1', TOP_POSITIONS_LIMIT, 0); }, enabled: !!marketId && !!network, staleTime: 1000 * 60 * 2, // 2 minutes diff --git a/src/hooks/useUserPosition.ts b/src/hooks/useUserPosition.ts index 40b81eae..3f2f4ee4 100644 --- a/src/hooks/useUserPosition.ts +++ b/src/hooks/useUserPosition.ts @@ -2,18 +2,57 @@ import { useQuery } from '@tanstack/react-query'; import type { Address } from 'viem'; import { usePublicClient } from 'wagmi'; import { supportsMorphoApi } from '@/config/dataSources'; +import { fetchMonarchUserPositionStateForMarket } from '@/data-sources/monarch-api'; import { fetchMorphoUserPositionForMarket } from '@/data-sources/morpho-api/positions'; import { fetchSubgraphUserPositionForMarket } from '@/data-sources/subgraph/positions'; import type { SupportedNetworks } from '@/utils/networks'; -import { fetchPositionSnapshot } from '@/utils/positions'; +import { convertSharesToAssets, fetchPositionSnapshot } from '@/utils/positions'; import type { MarketPosition } from '@/utils/types'; import { useProcessedMarkets } from './useProcessedMarkets'; +type SnapshotState = NonNullable<Awaited<ReturnType<typeof fetchPositionSnapshot>>>; + +const buildPositionStateFromSnapshot = (snapshot: SnapshotState): MarketPosition['state'] => ({ + supplyAssets: snapshot.supplyAssets.toString(), + supplyShares: snapshot.supplyShares.toString(), + borrowAssets: snapshot.borrowAssets.toString(), + borrowShares: snapshot.borrowShares.toString(), + collateral: snapshot.collateral, +}); + +const buildPositionFromLiveMarket = ( + market: MarketPosition['market'], + state: Pick<MarketPosition['state'], 'supplyShares' | 'borrowShares' | 'collateral'>, +): MarketPosition => { + const supplyAssets = convertSharesToAssets( + BigInt(state.supplyShares), + BigInt(market.state.supplyAssets), + BigInt(market.state.supplyShares), + 
).toString(); + const borrowAssets = convertSharesToAssets( + BigInt(state.borrowShares), + BigInt(market.state.borrowAssets), + BigInt(market.state.borrowShares), + ).toString(); + + return { + market, + state: { + supplyShares: state.supplyShares, + supplyAssets, + borrowShares: state.borrowShares, + borrowAssets, + collateral: state.collateral, + }, + }; +}; + /** * Hook to fetch a user's position in a specific market. * * Prioritizes the latest on-chain snapshot via `fetchPositionSnapshot`. - * Falls back to the configured data source (Morpho API or Subgraph) if the snapshot is unavailable. + * If the snapshot is unavailable and local market metadata is present, it tries Monarch position state first, + * then falls back to Morpho API or Subgraph for full market-backed reconstruction. * * @param user The user's address. * @param chainId The network ID. @@ -45,7 +84,8 @@ const useUserPosition = (user: string | undefined, chainId: SupportedNetworks | return null; } - // 1. Try fetching the on-chain snapshot first + const localMarket = markets?.find((m) => m.uniqueKey.toLowerCase() === marketKey.toLowerCase()); + console.log(`Attempting fetchPositionSnapshot for ${user} on market ${marketKey}`); let snapshot = null; try { @@ -59,22 +99,11 @@ const useUserPosition = (user: string | undefined, chainId: SupportedNetworks | let finalPosition: MarketPosition | null = null; if (snapshot) { - // Snapshot succeeded, try to use local market data first - const market = markets?.find((m) => m.uniqueKey.toLowerCase() === marketKey.toLowerCase()); - - if (market) { - // Local market data found, construct position directly + if (localMarket) { console.log(`Found local market data for ${marketKey}, constructing position from snapshot.`); finalPosition = { - market: market, - state: { - // Add state from snapshot - supplyAssets: snapshot.supplyAssets.toString(), - supplyShares: snapshot.supplyShares.toString(), - borrowAssets: snapshot.borrowAssets.toString(), - borrowShares: 
snapshot.borrowShares.toString(), - collateral: snapshot.collateral, - }, + market: localMarket, + state: buildPositionStateFromSnapshot(snapshot), }; } else { // Local market data NOT found, need to fetch from fallback to get structure @@ -107,13 +136,7 @@ const useUserPosition = (user: string | undefined, chainId: SupportedNetworks | // Fallback succeeded, combine with snapshot state finalPosition = { ...fallbackPosition, - state: { - supplyAssets: snapshot.supplyAssets.toString(), - supplyShares: snapshot.supplyShares.toString(), - borrowAssets: snapshot.borrowAssets.toString(), - borrowShares: snapshot.borrowShares.toString(), - collateral: snapshot.collateral, - }, + state: buildPositionStateFromSnapshot(snapshot), }; } else { // Fallback failed even though snapshot existed @@ -122,7 +145,16 @@ const useUserPosition = (user: string | undefined, chainId: SupportedNetworks | } } } else { - // Snapshot failed, rely entirely on the fallback data source + if (localMarket) { + try { + console.log(`Attempting to fetch position via Monarch for ${marketKey}`); + const monarchPositionState = await fetchMonarchUserPositionStateForMarket(marketKey, user, chainId); + return monarchPositionState ? 
buildPositionFromLiveMarket(localMarket, monarchPositionState) : null; + } catch (monarchError) { + console.error('Failed to fetch position via Monarch:', monarchError); + } + } + console.log(`Snapshot failed for ${marketKey}, fetching from fallback source.`); // Try Morpho API first if supported diff --git a/src/hooks/useUserPositions.ts b/src/hooks/useUserPositions.ts index 1bb9e649..dc0b2259 100644 --- a/src/hooks/useUserPositions.ts +++ b/src/hooks/useUserPositions.ts @@ -2,6 +2,7 @@ import { useCallback } from 'react'; import { useQuery, useQueryClient } from '@tanstack/react-query'; import type { Address } from 'viem'; import { supportsMorphoApi } from '@/config/dataSources'; +import { fetchMonarchUserPositionMarketsForNetworks } from '@/data-sources/monarch-api'; import { fetchMorphoUserPositionMarkets, fetchMorphoUserPositionMarketsForNetworks } from '@/data-sources/morpho-api/positions'; import { fetchSubgraphUserPositionMarkets } from '@/data-sources/subgraph/positions'; import { ALL_SUPPORTED_NETWORKS, type SupportedNetworks } from '@/utils/networks'; @@ -124,6 +125,13 @@ const appendFulfilledPositionMarkets = ( // Fetches market keys ONLY from API/Subgraph sources const fetchSourceMarketKeys = async (user: string, chainIds?: SupportedNetworks[]): Promise<PositionMarket[]> => { const networksToFetch = chainIds ??
ALL_SUPPORTED_NETWORKS; + + try { + return await fetchMonarchUserPositionMarketsForNetworks(user, networksToFetch); + } catch (error) { + console.error('[Positions] Failed batched Monarch position lookup, falling back to Morpho/subgraph strategy:', error); + } + const morphoApiNetworks = networksToFetch.filter((network) => supportsMorphoApi(network)); const fallbackNetworks = networksToFetch.filter((network) => !supportsMorphoApi(network)); const sourcePositionMarkets: PositionMarket[] = []; diff --git a/src/utils/positions.ts b/src/utils/positions.ts index 3dbec8f4..39ede40c 100644 --- a/src/utils/positions.ts +++ b/src/utils/positions.ts @@ -60,6 +60,8 @@ export type BorrowPositionRow = { isActiveDebt: boolean; }; +const ONE_YEAR_IN_SECONDS = 86_400 * 365; + function normalizeOraclePriceResult(value: unknown): string | null { if (typeof value === 'bigint' || typeof value === 'number' || typeof value === 'string') { return value.toString(); @@ -377,35 +379,53 @@ export function getGroupedEarnings(groupedPosition: GroupedPosition): bigint { } /** - * Get weighted actual APY for a group of positions - * Weighted by capital-time contribution from each position (avgCapital * effectiveTime) + * Get grouped actual APY for a group of positions. + * Aggregate earnings and capital-time first, then annualize once at the group level. 
* * @param groupedPosition - The grouped position - * @returns The weighted actual APY as a number + * @param chainBlockData - Period start block/timestamp keyed by chain ID + * @param endTimestamp - Period end timestamp + * @returns The grouped actual APY as a number */ -export function getGroupedActualApy(groupedPosition: GroupedPosition): number { - let totalWeightedApy = 0; +export function getGroupedActualApy( + groupedPosition: GroupedPosition, + chainBlockData: Record<SupportedNetworks, { blockNumber: number; timestamp: number } | undefined>, + endTimestamp: number = Math.floor(Date.now() / 1000), +): number { + const startTimestamp = chainBlockData[groupedPosition.chainId]?.timestamp; + if (!startTimestamp || endTimestamp <= startTimestamp) return 0; + + const fullWindowSeconds = endTimestamp - startTimestamp; + let totalEarned = 0n; let totalCapitalTime = 0n; for (const position of groupedPosition.markets) { const avgCapital = BigInt(position.avgCapital ?? '0'); const effectiveTime = BigInt(Math.max(0, position.effectiveTime ?? 0)); - if (avgCapital <= 0n || effectiveTime <= 0n) continue; - if (!Number.isFinite(position.actualApy)) continue; - const capitalTime = avgCapital * effectiveTime; const capitalTimeAsNumber = Number(capitalTime); if (!Number.isFinite(capitalTimeAsNumber) || capitalTimeAsNumber <= 0) continue; + if (capitalTime <= 0n) continue; - totalWeightedApy += capitalTimeAsNumber * position.actualApy; + totalEarned += BigInt(position.earned ??
'0'); totalCapitalTime += capitalTime; } - if (totalCapitalTime <= 0n) return 0; - const totalCapitalTimeAsNumber = Number(totalCapitalTime); - if (!Number.isFinite(totalCapitalTimeAsNumber) || totalCapitalTimeAsNumber <= 0) return 0; + if (totalCapitalTime <= 0n || totalEarned <= 0n) return 0; + + const averageCapital = totalCapitalTime / BigInt(fullWindowSeconds); + if (averageCapital <= 0n) return 0; + + const earnedAsNumber = Number(formatUnits(totalEarned, groupedPosition.loanAssetDecimals)); + const averageCapitalAsNumber = Number(formatUnits(averageCapital, groupedPosition.loanAssetDecimals)); + if (!Number.isFinite(earnedAsNumber) || !Number.isFinite(averageCapitalAsNumber) || averageCapitalAsNumber <= 0) return 0; + + const periods = ONE_YEAR_IN_SECONDS / fullWindowSeconds; + const base = earnedAsNumber / averageCapitalAsNumber + 1; + + if (!Number.isFinite(periods) || periods <= 0 || periods > 1_000_000) return 0; + if (!Number.isFinite(base) || base <= 0) return 0; - return totalWeightedApy / totalCapitalTimeAsNumber; + const annualized = base ** periods - 1; + return Number.isFinite(annualized) ? 
annualized : 0; } /** @@ -414,7 +434,10 @@ export function getGroupedActualApy(groupedPosition: GroupedPosition): number { * @param positions - Array of positions with earnings * @returns Array of grouped positions */ -export function groupPositionsByLoanAsset(positions: MarketPositionWithEarnings[]): GroupedPosition[] { +export function groupPositionsByLoanAsset( + positions: MarketPositionWithEarnings[], + chainBlockData: Record<SupportedNetworks, { blockNumber: number; timestamp: number } | undefined>, +): GroupedPosition[] { return positions .filter((position) => BigInt(position.state.supplyShares) > 0) .reduce((acc: GroupedPosition[], position) => { @@ -481,7 +504,7 @@ export function groupPositionsByLoanAsset(positions: MarketPositionWithEarnings[ groupedPosition.totalWeightedApy = 0; // Avoid division by zero } // Calculate weighted actual APY across markets - groupedPosition.actualApy = getGroupedActualApy(groupedPosition); + groupedPosition.actualApy = getGroupedActualApy(groupedPosition, chainBlockData); return groupedPosition; }) .sort((a, b) => b.totalSupply - a.totalSupply); diff --git a/src/utils/types.ts b/src/utils/types.ts index bcd2a8a3..aa1e1615 100644 --- a/src/utils/types.ts +++ b/src/utils/types.ts @@ -41,6 +41,7 @@ export enum UserTxTypes { } export type UserTransaction = { + id?: string; hash: string; timestamp: number; type: UserTxTypes; @@ -215,7 +216,7 @@ export type GroupedPosition = { loanAssetSymbol: string; totalSupply: number; totalWeightedApy: number; - actualApy: number; // Weighted historical APY across all markets + actualApy: number; // Grouped historical APY across all markets earned?: PositionEarnings; diff --git a/src/utils/user-transaction-history-cache.ts b/src/utils/user-transaction-history-cache.ts index 8147898a..0a5f75ad 100644 --- a/src/utils/user-transaction-history-cache.ts +++ b/src/utils/user-transaction-history-cache.ts @@ -3,6 +3,7 @@ import morphoAbi from '@/abis/morpho'; import { getMorphoAddress } from '@/utils/morpho'; import type { SupportedNetworks } from '@/utils/networks'; 
import { type UserTransaction, UserTxTypes } from '@/utils/types'; +import { getUserTransactionIdentity, getUserTransactionMergeKey, sortUserTransactions } from '@/utils/user-transactions'; const CACHE_KEY = 'monarch_cache_userTransactionHistory_v1'; const CACHE_TTL_MS = 5 * 60 * 1000; @@ -30,13 +31,6 @@ type CachedUserTransactionEntry = { const normalizeAddress = (address: string): Address => address.toLowerCase() as Address; -const getTransactionDedupKey = (transaction: UserTransaction): string => { - const marketKey = transaction.data?.market?.uniqueKey?.toLowerCase() ?? ''; - const assets = transaction.data?.assets ?? '0'; - const shares = transaction.data?.shares ?? '0'; - return `${transaction.hash.toLowerCase()}:${transaction.type}:${marketKey}:${assets}:${shares}`; -}; - const getCacheEntryDedupKey = (entry: CachedUserTransactionEntry): string => `${entry.chainId}:${entry.userAddress}:${entry.tx.hash.toLowerCase()}:${entry.logIndex}`; @@ -158,6 +152,7 @@ export function cacheUserTransactionHistoryFromReceipt({ expiresAt, logIndex: log.logIndex ?? index, tx: { + id: `${chainId}:${txHash.toLowerCase()}:${log.logIndex ?? 
index}`, hash: txHash, timestamp, type: txType, @@ -213,7 +208,7 @@ export function mergeUserTransactionsWithRecentCache({ const normalizedUser = normalizeAddress(userAddress); const chainIdSet = new Set(chainIds); - const apiHashes = new Set(apiTransactions.map((tx) => tx.hash.toLowerCase())); + const apiTransactionKeys = new Set(apiTransactions.map(getUserTransactionMergeKey)); const activeEntries = getActiveCacheEntries(); if (activeEntries.length === 0) { @@ -225,7 +220,7 @@ export function mergeUserTransactionsWithRecentCache({ if (entry.userAddress !== normalizedUser || !chainIdSet.has(entry.chainId)) { return false; } - return !apiHashes.has(entry.tx.hash.toLowerCase()); + return !apiTransactionKeys.has(getUserTransactionMergeKey(entry.tx)); }) .map((entry) => entry.tx); @@ -237,14 +232,13 @@ export function mergeUserTransactionsWithRecentCache({ const seen = new Set(); for (const tx of [...apiTransactions, ...cachedTransactions]) { - const key = getTransactionDedupKey(tx); + const key = getUserTransactionIdentity(tx); if (seen.has(key)) continue; seen.add(key); deduped.push(tx); } - deduped.sort((a, b) => b.timestamp - a.timestamp); - return deduped; + return sortUserTransactions(deduped); } export function reconcileUserTransactionHistoryCache({ @@ -262,7 +256,7 @@ export function reconcileUserTransactionHistoryCache({ const normalizedUser = normalizeAddress(userAddress); const chainIdSet = new Set(chainIds); - const apiHashes = new Set(apiTransactions.map((tx) => tx.hash.toLowerCase())); + const apiTransactionKeys = new Set(apiTransactions.map(getUserTransactionMergeKey)); const activeEntries = readAndPruneCacheEntries(); if (activeEntries.length === 0) return; @@ -271,7 +265,7 @@ export function reconcileUserTransactionHistoryCache({ const isRelevantEntry = entry.userAddress === normalizedUser && chainIdSet.has(entry.chainId); if (!isRelevantEntry) return true; - const shouldKeep = !apiHashes.has(entry.tx.hash.toLowerCase()); + const shouldKeep = 
!apiTransactionKeys.has(getUserTransactionMergeKey(entry.tx)); return shouldKeep; }); diff --git a/src/utils/user-transactions.ts b/src/utils/user-transactions.ts new file mode 100644 index 00000000..0031912f --- /dev/null +++ b/src/utils/user-transactions.ts @@ -0,0 +1,82 @@ +import type { UserTransaction } from './types'; + +export type TransactionFilters = { + userAddress: string[]; + chainId: number; + marketUniqueKeys?: string[]; + timestampGte?: number; + timestampLte?: number; + skip?: number; + first?: number; + hash?: string; + assetIds?: string[]; +}; + +export type TransactionResponse = { + items: UserTransaction[]; + pageInfo: { + count: number; + countTotal: number; + }; + error: string | null; +}; + +export const emptyTransactionResponse = (error: string | null = null): TransactionResponse => ({ + items: [], + pageInfo: { + count: 0, + countTotal: 0, + }, + error, +}); + +export const getUserTransactionIdentity = (transaction: UserTransaction): string => { + if (transaction.id) { + return transaction.id.toLowerCase(); + } + + return `${getUserTransactionMergeKey(transaction)}:${transaction.timestamp}`; +}; + +export const getUserTransactionMergeKey = (transaction: UserTransaction): string => { + const marketKey = transaction.data?.market?.uniqueKey?.toLowerCase() ?? ''; + const assets = transaction.data?.assets ?? '0'; + const shares = transaction.data?.shares ?? 
'0'; + + return `${transaction.hash.toLowerCase()}:${transaction.type}:${marketKey}:${assets}:${shares}`; +}; + +export const compareUserTransactions = (a: UserTransaction, b: UserTransaction): number => { + if (b.timestamp !== a.timestamp) { + return b.timestamp - a.timestamp; + } + + const hashCompare = b.hash.localeCompare(a.hash); + if (hashCompare !== 0) { + return hashCompare; + } + + return getUserTransactionIdentity(b).localeCompare(getUserTransactionIdentity(a)); +}; + +export const sortUserTransactions = (transactions: UserTransaction[]): UserTransaction[] => { + transactions.sort(compareUserTransactions); + return transactions; +}; + +export const dedupeUserTransactions = (transactions: UserTransaction[]): UserTransaction[] => { + const deduped: UserTransaction[] = []; + const seen = new Set(); + + for (const transaction of transactions) { + const identity = getUserTransactionIdentity(transaction); + if (seen.has(identity)) { + continue; + } + + seen.add(identity); + deduped.push(transaction); + } + + return deduped; +}; diff --git a/src/utils/vaultYield.ts b/src/utils/vaultYield.ts new file mode 100644 index 00000000..55ce4155 --- /dev/null +++ b/src/utils/vaultYield.ts @@ -0,0 +1,195 @@ +import type { Address } from 'viem'; +import { erc4626Abi } from '@/abis/erc4626'; +import { computeAnnualizedApyFromGrowth } from '@/hooks/leverage/math'; +import { estimateBlockAtTimestamp } from '@/utils/blockEstimation'; +import type { SupportedNetworks } from '@/utils/networks'; +import { getClient } from '@/utils/rpc'; +import { getVaultReadKey } from '@/utils/vaultAllocation'; + +export const DEFAULT_VAULT_APY_LOOKBACK_DAYS = 3; +const SECONDS_PER_DAY = 24 * 60 * 60; + +export type VaultYieldRequest = { + address: Address; + networkId: SupportedNetworks; +}; + +export type VaultYieldSnapshot = { + vaultApy: number | null; + sharePriceNow: bigint | null; + periodSeconds: number | null; + currentBlock: bigint | null; + pastBlock: bigint | null; +}; + +const 
buildNullSnapshot = (): VaultYieldSnapshot => ({ + vaultApy: null, + sharePriceNow: null, + periodSeconds: null, + currentBlock: null, + pastBlock: null, +}); + +const setNetworkResults = ( + results: Map<string, VaultYieldSnapshot>, + vaults: VaultYieldRequest[], + networkId: SupportedNetworks, + snapshotFactory: () => VaultYieldSnapshot, +): void => { + for (const vault of vaults) { + results.set(getVaultReadKey(vault.address, networkId), snapshotFactory()); + } +}; + +export async function fetchVaultYieldSnapshots({ + vaults, + lookbackDays = DEFAULT_VAULT_APY_LOOKBACK_DAYS, + customRpcUrls, + throwOnFailure = false, +}: { + vaults: VaultYieldRequest[]; + lookbackDays?: number; + customRpcUrls?: Partial<Record<SupportedNetworks, string>>; + throwOnFailure?: boolean; +}): Promise<Map<string, VaultYieldSnapshot>> { + const results = new Map<string, VaultYieldSnapshot>(); + + if (vaults.length === 0) { + return results; + } + + const vaultsByNetwork = vaults.reduce( + (acc, vault) => { + const existing = acc[vault.networkId] ?? []; + existing.push(vault); + acc[vault.networkId] = existing; + return acc; + }, + {} as Record<SupportedNetworks, VaultYieldRequest[]>, + ); + + await Promise.all( + Object.entries(vaultsByNetwork).map(async ([networkIdValue, networkVaults]) => { + const networkId = Number(networkIdValue) as SupportedNetworks; + + try { + const client = getClient(networkId, customRpcUrls?.[networkId]); + const currentBlock = await client.getBlockNumber(); + const currentBlockData = await client.getBlock({ blockNumber: currentBlock }); + const currentTimestamp = Number(currentBlockData.timestamp); + + const targetTimestamp = currentTimestamp - lookbackDays * SECONDS_PER_DAY; + const estimatedPastBlock = BigInt(estimateBlockAtTimestamp(networkId, targetTimestamp, Number(currentBlock), currentTimestamp)); + const pastBlockData = await client.getBlock({ blockNumber: estimatedPastBlock }); + const pastTimestamp = Number(pastBlockData.timestamp); + const periodSeconds = currentTimestamp - pastTimestamp; + + if (periodSeconds <= 0) { + setNetworkResults(results, networkVaults, networkId, buildNullSnapshot); + return; + } 
+ + const decimalsResults = await client.multicall({ + contracts: networkVaults.map((vault) => ({ + address: vault.address, + abi: erc4626Abi, + functionName: 'decimals' as const, + args: [], + })), + allowFailure: true, + }); + + const previewableVaults = networkVaults + .map((vault, index) => { + const decimalsResult = decimalsResults[index]; + if (decimalsResult?.status !== 'success' || typeof decimalsResult.result !== 'number') { + return null; + } + + return { + ...vault, + oneShareUnit: 10n ** BigInt(decimalsResult.result), + }; + }) + .filter((vault): vault is VaultYieldRequest & { oneShareUnit: bigint } => vault !== null); + + const baseSnapshot = { + currentBlock, + pastBlock: estimatedPastBlock, + periodSeconds, + } satisfies Pick<VaultYieldSnapshot, 'currentBlock' | 'pastBlock' | 'periodSeconds'>; + + setNetworkResults(results, networkVaults, networkId, () => ({ + ...buildNullSnapshot(), + ...baseSnapshot, + })); + + if (previewableVaults.length === 0) { + return; + } + + const previewContracts = previewableVaults.map((vault) => ({ + address: vault.address, + abi: erc4626Abi, + functionName: 'previewRedeem' as const, + args: [vault.oneShareUnit] as const, + })); + + const currentPreviewResults = await client.multicall({ + contracts: previewContracts, + allowFailure: true, + blockNumber: currentBlock, + }); + + let pastPreviewResults: typeof currentPreviewResults | null = null; + try { + pastPreviewResults = await client.multicall({ + contracts: previewContracts, + allowFailure: true, + blockNumber: estimatedPastBlock, + }); + } catch { + // Some RPCs do not support archive eth_call on historical blocks. + pastPreviewResults = null; + } + + previewableVaults.forEach((vault, index) => { + const currentPreviewResult = currentPreviewResults[index]; + const sharePriceNow = + currentPreviewResult?.status === 'success' && typeof currentPreviewResult.result === 'bigint' + ? 
currentPreviewResult.result + : null; + const pastPreviewResult = pastPreviewResults?.[index]; + const pastSharePrice = + pastPreviewResult?.status === 'success' && typeof pastPreviewResult.result === 'bigint' ? pastPreviewResult.result : null; + + const vaultApy = + sharePriceNow && pastSharePrice + ? computeAnnualizedApyFromGrowth({ + currentValue: sharePriceNow, + pastValue: pastSharePrice, + periodSeconds, + }) + : null; + + results.set(getVaultReadKey(vault.address, networkId), { + vaultApy, + sharePriceNow, + periodSeconds, + currentBlock, + pastBlock: estimatedPastBlock, + }); + }); + } catch (error) { + if (throwOnFailure) { + throw error; + } + + console.warn(`[vaultYield] Failed to fetch vault yield snapshots for chain ${networkId}:`, error); + setNetworkResults(results, networkVaults, networkId, buildNullSnapshot); + } + }), + ); + + return results; +}
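The `getGroupedActualApy` rewrite in the patch implements guideline 44: sum raw earned value and capital-time exposure first, then annualize once. A minimal sketch with plain numbers (simplified types, hypothetical helper names; the real code uses `bigint` shares/assets and loan-asset decimals) shows why the replaced weighted-average approach was wrong — a dust position held briefly produces an astronomical annualized rate that survives the weighting:

```typescript
// Hypothetical simplified shape; the real code carries bigint values.
type MarketRealized = { earned: number; avgCapital: number; effectiveTime: number };

const YEAR = 365 * 86_400;

// Anti-pattern (the removed code path): capital-time-weighted average of
// per-market annualized rates. A tiny position with a short holding window
// compounds to an enormous per-market APY that poisons the group figure.
function weightedAverageApy(markets: MarketRealized[]): number {
  let weighted = 0;
  let totalWeight = 0;
  for (const m of markets) {
    const apy = (m.earned / m.avgCapital + 1) ** (YEAR / m.effectiveTime) - 1;
    const weight = m.avgCapital * m.effectiveTime;
    weighted += weight * apy;
    totalWeight += weight;
  }
  return totalWeight > 0 ? weighted / totalWeight : 0;
}

// Preferred (what the patch does): aggregate earned value and capital-time
// exposure first, then annualize once over the shared observation window.
function groupedApy(markets: MarketRealized[], windowSeconds: number): number {
  let earned = 0;
  let capitalTime = 0;
  for (const m of markets) {
    earned += m.earned;
    capitalTime += m.avgCapital * m.effectiveTime;
  }
  if (capitalTime <= 0) return 0;
  const averageCapital = capitalTime / windowSeconds;
  return (earned / averageCapital + 1) ** (YEAR / windowSeconds) - 1;
}

// A 10,000-unit position earning ~0.5%/year, plus a dust position that
// earned 0.01 on 0.001 of capital over one hour.
const sampleMarkets: MarketRealized[] = [
  { earned: 50, avgCapital: 10_000, effectiveTime: YEAR },
  { earned: 0.01, avgCapital: 0.001, effectiveTime: 3_600 },
];
```

With `sampleMarkets`, the weighted average overflows to `Infinity` because the dust position annualizes to `11 ** 8760 - 1`, while the grouped calculation stays near the 0.5% the large position actually earned.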
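The new `getUserTransactionMergeKey` in `src/utils/user-transactions.ts` exists because raw tx-hash presence is too coarse a reconciliation key (guideline 47): one on-chain transaction can emit several distinct events, and each must survive dedup as its own history row. A simplified sketch (hypothetical trimmed `Tx` type; the real `UserTransaction` also carries shares, timestamp, and an optional indexed `id`):

```typescript
// Hypothetical simplified transaction shape for illustration.
type Tx = { hash: string; type: string; market: string; assets: string };

// Merge key combining hash + event type + market + amount, so two different
// events from the same transaction are kept, while the same event seen from
// both the API and the local receipt cache collapses to one row.
const mergeKey = (tx: Tx): string =>
  `${tx.hash.toLowerCase()}:${tx.type}:${tx.market.toLowerCase()}:${tx.assets}`;

function dedupeByMergeKey(txs: Tx[]): Tx[] {
  const seen = new Set<string>();
  const out: Tx[] = [];
  for (const tx of txs) {
    const key = mergeKey(tx);
    if (seen.has(key)) continue; // drop only true duplicates
    seen.add(key);
    out.push(tx);
  }
  return out;
}

// One transaction bundling a supply and a borrow, plus a cached duplicate
// of the supply event (hash casing differs across sources).
const sameTxTwoEvents: Tx[] = [
  { hash: '0xAB', type: 'supply', market: 'm1', assets: '100' },
  { hash: '0xab', type: 'borrow', market: 'm1', assets: '40' },
  { hash: '0xab', type: 'supply', market: 'm1', assets: '100' },
];
```

Keying by hash alone would collapse all three entries to one and silently drop the borrow event.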
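The hook changes above all follow the same cascade shape from guideline 45: a thrown primary error falls through to the next provider, while a successful result — including an authoritative empty one from a fully scoped primary read — is returned without triggering fallback. A minimal sketch of that pattern (the `fetchWithFallback` helper is hypothetical; the hooks inline the try/catch chain instead):

```typescript
// Try providers in priority order. Only a thrown error advances to the next
// provider; a resolved value (even an empty list) is treated as final.
async function fetchWithFallback<T>(providers: Array<() => Promise<T>>): Promise<T> {
  let lastError: unknown = new Error('no providers configured');
  for (const provider of providers) {
    try {
      return await provider();
    } catch (error) {
      lastError = error; // remember the failure, continue down the cascade
    }
  }
  throw lastError; // every provider failed: fail closed instead of returning empty success
}
```

The key property is that "empty but successful" and "failed" are distinguishable: an empty array from a scoped, fully paginated primary read short-circuits the cascade, whereas a network or source error keeps falling through.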